Dec 05 08:18:49 localhost kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec 05 08:18:49 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 05 08:18:49 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 08:18:49 localhost kernel: BIOS-provided physical RAM map:
Dec 05 08:18:49 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 05 08:18:49 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 05 08:18:49 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 05 08:18:49 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 05 08:18:49 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 05 08:18:49 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 05 08:18:49 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 05 08:18:49 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 05 08:18:49 localhost kernel: NX (Execute Disable) protection: active
Dec 05 08:18:49 localhost kernel: APIC: Static calls initialized
Dec 05 08:18:49 localhost kernel: SMBIOS 2.8 present.
Dec 05 08:18:49 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 05 08:18:49 localhost kernel: Hypervisor detected: KVM
Dec 05 08:18:49 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 05 08:18:49 localhost kernel: kvm-clock: using sched offset of 3166111870 cycles
Dec 05 08:18:49 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 05 08:18:49 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 05 08:18:49 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 05 08:18:49 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 05 08:18:49 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 05 08:18:49 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 05 08:18:49 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 05 08:18:49 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 05 08:18:49 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 05 08:18:49 localhost kernel: Using GB pages for direct mapping
Dec 05 08:18:49 localhost kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec 05 08:18:49 localhost kernel: ACPI: Early table checksum verification disabled
Dec 05 08:18:49 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 05 08:18:49 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 08:18:49 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 08:18:49 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 08:18:49 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 05 08:18:49 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 08:18:49 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 08:18:49 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 05 08:18:49 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 05 08:18:49 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 05 08:18:49 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 05 08:18:49 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 05 08:18:49 localhost kernel: No NUMA configuration found
Dec 05 08:18:49 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 05 08:18:49 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec 05 08:18:49 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 05 08:18:49 localhost kernel: Zone ranges:
Dec 05 08:18:49 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 05 08:18:49 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 05 08:18:49 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 05 08:18:49 localhost kernel:   Device   empty
Dec 05 08:18:49 localhost kernel: Movable zone start for each node
Dec 05 08:18:49 localhost kernel: Early memory node ranges
Dec 05 08:18:49 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 05 08:18:49 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 05 08:18:49 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 05 08:18:49 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 05 08:18:49 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 05 08:18:49 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 05 08:18:49 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 05 08:18:49 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 05 08:18:49 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 05 08:18:49 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 05 08:18:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 05 08:18:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 05 08:18:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 05 08:18:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 05 08:18:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 05 08:18:49 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 05 08:18:49 localhost kernel: TSC deadline timer available
Dec 05 08:18:49 localhost kernel: CPU topo: Max. logical packages:   8
Dec 05 08:18:49 localhost kernel: CPU topo: Max. logical dies:       8
Dec 05 08:18:49 localhost kernel: CPU topo: Max. dies per package:   1
Dec 05 08:18:49 localhost kernel: CPU topo: Max. threads per core:   1
Dec 05 08:18:49 localhost kernel: CPU topo: Num. cores per package:     1
Dec 05 08:18:49 localhost kernel: CPU topo: Num. threads per package:   1
Dec 05 08:18:49 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 05 08:18:49 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 05 08:18:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 05 08:18:49 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 05 08:18:49 localhost kernel: Booting paravirtualized kernel on KVM
Dec 05 08:18:49 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 05 08:18:49 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 05 08:18:49 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 05 08:18:49 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 05 08:18:49 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 05 08:18:49 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 05 08:18:49 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 08:18:49 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec 05 08:18:49 localhost kernel: random: crng init done
Dec 05 08:18:49 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 05 08:18:49 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 05 08:18:49 localhost kernel: Fallback order for Node 0: 0 
Dec 05 08:18:49 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 05 08:18:49 localhost kernel: Policy zone: Normal
Dec 05 08:18:49 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 05 08:18:49 localhost kernel: software IO TLB: area num 8.
Dec 05 08:18:49 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 05 08:18:49 localhost kernel: ftrace: allocating 49335 entries in 193 pages
Dec 05 08:18:49 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 05 08:18:49 localhost kernel: Dynamic Preempt: voluntary
Dec 05 08:18:49 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 05 08:18:49 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 05 08:18:49 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 05 08:18:49 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 05 08:18:49 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 05 08:18:49 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 05 08:18:49 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 05 08:18:49 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 05 08:18:49 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 05 08:18:49 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 05 08:18:49 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 05 08:18:49 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 05 08:18:49 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 05 08:18:49 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 05 08:18:49 localhost kernel: Console: colour VGA+ 80x25
Dec 05 08:18:49 localhost kernel: printk: console [ttyS0] enabled
Dec 05 08:18:49 localhost kernel: ACPI: Core revision 20230331
Dec 05 08:18:49 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 05 08:18:49 localhost kernel: x2apic enabled
Dec 05 08:18:49 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 05 08:18:49 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 05 08:18:49 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 05 08:18:49 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 05 08:18:49 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 05 08:18:49 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 05 08:18:49 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 05 08:18:49 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 05 08:18:49 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 05 08:18:49 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 05 08:18:49 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 05 08:18:49 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 05 08:18:49 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 05 08:18:49 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 05 08:18:49 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 05 08:18:49 localhost kernel: x86/bugs: return thunk changed
Dec 05 08:18:49 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 05 08:18:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 05 08:18:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 05 08:18:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 05 08:18:49 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 05 08:18:49 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 05 08:18:49 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 05 08:18:49 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 05 08:18:49 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 05 08:18:49 localhost kernel: landlock: Up and running.
Dec 05 08:18:49 localhost kernel: Yama: becoming mindful.
Dec 05 08:18:49 localhost kernel: SELinux:  Initializing.
Dec 05 08:18:49 localhost kernel: LSM support for eBPF active
Dec 05 08:18:49 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 05 08:18:49 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 05 08:18:49 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 05 08:18:49 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 05 08:18:49 localhost kernel: ... version:                0
Dec 05 08:18:49 localhost kernel: ... bit width:              48
Dec 05 08:18:49 localhost kernel: ... generic registers:      6
Dec 05 08:18:49 localhost kernel: ... value mask:             0000ffffffffffff
Dec 05 08:18:49 localhost kernel: ... max period:             00007fffffffffff
Dec 05 08:18:49 localhost kernel: ... fixed-purpose events:   0
Dec 05 08:18:49 localhost kernel: ... event mask:             000000000000003f
Dec 05 08:18:49 localhost kernel: signal: max sigframe size: 1776
Dec 05 08:18:49 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 05 08:18:49 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 05 08:18:49 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 05 08:18:49 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 05 08:18:49 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 05 08:18:49 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 05 08:18:49 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 05 08:18:49 localhost kernel: node 0 deferred pages initialised in 41ms
Dec 05 08:18:49 localhost kernel: Memory: 7763864K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618220K reserved, 0K cma-reserved)
Dec 05 08:18:49 localhost kernel: devtmpfs: initialized
Dec 05 08:18:49 localhost kernel: x86/mm: Memory block size: 128MB
Dec 05 08:18:49 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 05 08:18:49 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 05 08:18:49 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 05 08:18:49 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 05 08:18:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 05 08:18:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 05 08:18:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 05 08:18:49 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 05 08:18:49 localhost kernel: audit: type=2000 audit(1764922726.618:1): state=initialized audit_enabled=0 res=1
Dec 05 08:18:49 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 05 08:18:49 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 05 08:18:49 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 05 08:18:49 localhost kernel: cpuidle: using governor menu
Dec 05 08:18:49 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 05 08:18:49 localhost kernel: PCI: Using configuration type 1 for base access
Dec 05 08:18:49 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 05 08:18:49 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 05 08:18:49 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 05 08:18:49 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 05 08:18:49 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 05 08:18:49 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 05 08:18:49 localhost kernel: Demotion targets for Node 0: null
Dec 05 08:18:49 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 05 08:18:49 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 05 08:18:49 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 05 08:18:49 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 05 08:18:49 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 05 08:18:49 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 05 08:18:49 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 05 08:18:49 localhost kernel: ACPI: Interpreter enabled
Dec 05 08:18:49 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 05 08:18:49 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 05 08:18:49 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 05 08:18:49 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 05 08:18:49 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 05 08:18:49 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 05 08:18:49 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [3] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [4] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [5] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [6] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [7] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [8] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [9] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [10] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [11] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [12] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [13] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [14] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [15] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [16] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [17] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [18] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [19] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [20] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [21] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [22] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [23] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [24] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [25] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [26] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [27] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [28] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [29] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [30] registered
Dec 05 08:18:49 localhost kernel: acpiphp: Slot [31] registered
Dec 05 08:18:49 localhost kernel: PCI host bridge to bus 0000:00
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 05 08:18:49 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 05 08:18:49 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 05 08:18:49 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 05 08:18:49 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 05 08:18:49 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 05 08:18:49 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 05 08:18:49 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 05 08:18:49 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 05 08:18:49 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 05 08:18:49 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 05 08:18:49 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 05 08:18:49 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 05 08:18:49 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 05 08:18:49 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 05 08:18:49 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 05 08:18:49 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 05 08:18:49 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 05 08:18:49 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 05 08:18:49 localhost kernel: iommu: Default domain type: Translated
Dec 05 08:18:49 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 05 08:18:49 localhost kernel: SCSI subsystem initialized
Dec 05 08:18:49 localhost kernel: ACPI: bus type USB registered
Dec 05 08:18:49 localhost kernel: usbcore: registered new interface driver usbfs
Dec 05 08:18:49 localhost kernel: usbcore: registered new interface driver hub
Dec 05 08:18:49 localhost kernel: usbcore: registered new device driver usb
Dec 05 08:18:49 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 05 08:18:49 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 05 08:18:49 localhost kernel: PTP clock support registered
Dec 05 08:18:49 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 05 08:18:49 localhost kernel: NetLabel: Initializing
Dec 05 08:18:49 localhost kernel: NetLabel:  domain hash size = 128
Dec 05 08:18:49 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 05 08:18:49 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 05 08:18:49 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 05 08:18:49 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 05 08:18:49 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 05 08:18:49 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 05 08:18:49 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 05 08:18:49 localhost kernel: vgaarb: loaded
Dec 05 08:18:49 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 05 08:18:49 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 05 08:18:49 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 05 08:18:49 localhost kernel: pnp: PnP ACPI init
Dec 05 08:18:49 localhost kernel: pnp 00:03: [dma 2]
Dec 05 08:18:49 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 05 08:18:49 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 05 08:18:49 localhost kernel: NET: Registered PF_INET protocol family
Dec 05 08:18:49 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 05 08:18:49 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 05 08:18:49 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 05 08:18:49 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 05 08:18:49 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 05 08:18:49 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 05 08:18:49 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 05 08:18:49 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 05 08:18:49 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 05 08:18:49 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 05 08:18:49 localhost kernel: NET: Registered PF_XDP protocol family
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 05 08:18:49 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 05 08:18:49 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 05 08:18:49 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 05 08:18:49 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 71330 usecs
Dec 05 08:18:49 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 05 08:18:49 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 05 08:18:49 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 05 08:18:49 localhost kernel: ACPI: bus type thunderbolt registered
Dec 05 08:18:49 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 05 08:18:49 localhost kernel: Initialise system trusted keyrings
Dec 05 08:18:49 localhost kernel: Key type blacklist registered
Dec 05 08:18:49 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 05 08:18:49 localhost kernel: zbud: loaded
Dec 05 08:18:49 localhost kernel: integrity: Platform Keyring initialized
Dec 05 08:18:49 localhost kernel: integrity: Machine keyring initialized
Dec 05 08:18:49 localhost kernel: Freeing initrd memory: 87804K
Dec 05 08:18:49 localhost kernel: NET: Registered PF_ALG protocol family
Dec 05 08:18:49 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 05 08:18:49 localhost kernel: Key type asymmetric registered
Dec 05 08:18:49 localhost kernel: Asymmetric key parser 'x509' registered
Dec 05 08:18:49 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 05 08:18:49 localhost kernel: io scheduler mq-deadline registered
Dec 05 08:18:49 localhost kernel: io scheduler kyber registered
Dec 05 08:18:49 localhost kernel: io scheduler bfq registered
Dec 05 08:18:49 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 05 08:18:49 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 05 08:18:49 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 05 08:18:49 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 05 08:18:49 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 05 08:18:49 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 05 08:18:49 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 05 08:18:49 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 05 08:18:49 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 05 08:18:49 localhost kernel: Non-volatile memory driver v1.3
Dec 05 08:18:49 localhost kernel: rdac: device handler registered
Dec 05 08:18:49 localhost kernel: hp_sw: device handler registered
Dec 05 08:18:49 localhost kernel: emc: device handler registered
Dec 05 08:18:49 localhost kernel: alua: device handler registered
Dec 05 08:18:49 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 05 08:18:49 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 05 08:18:49 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 05 08:18:49 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 05 08:18:49 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 05 08:18:49 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 05 08:18:49 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 05 08:18:49 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec 05 08:18:49 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 05 08:18:49 localhost kernel: hub 1-0:1.0: USB hub found
Dec 05 08:18:49 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 05 08:18:49 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 05 08:18:49 localhost kernel: usbserial: USB Serial support registered for generic
Dec 05 08:18:49 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 05 08:18:49 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 05 08:18:49 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 05 08:18:49 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 05 08:18:49 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 05 08:18:49 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 05 08:18:49 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 05 08:18:49 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-05T08:18:48 UTC (1764922728)
Dec 05 08:18:49 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 05 08:18:49 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 05 08:18:49 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 05 08:18:49 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 05 08:18:49 localhost kernel: usbcore: registered new interface driver usbhid
Dec 05 08:18:49 localhost kernel: usbhid: USB HID core driver
Dec 05 08:18:49 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 05 08:18:49 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 05 08:18:49 localhost kernel: Initializing XFRM netlink socket
Dec 05 08:18:49 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 05 08:18:49 localhost kernel: Segment Routing with IPv6
Dec 05 08:18:49 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 05 08:18:49 localhost kernel: mpls_gso: MPLS GSO support
Dec 05 08:18:49 localhost kernel: IPI shorthand broadcast: enabled
Dec 05 08:18:49 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 05 08:18:49 localhost kernel: AES CTR mode by8 optimization enabled
Dec 05 08:18:49 localhost kernel: sched_clock: Marking stable (1607005305, 155462291)->(1861450389, -98982793)
Dec 05 08:18:49 localhost kernel: registered taskstats version 1
Dec 05 08:18:49 localhost kernel: Loading compiled-in X.509 certificates
Dec 05 08:18:49 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 05 08:18:49 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 05 08:18:49 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 05 08:18:49 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 05 08:18:49 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 05 08:18:49 localhost kernel: Demotion targets for Node 0: null
Dec 05 08:18:49 localhost kernel: page_owner is disabled
Dec 05 08:18:49 localhost kernel: Key type .fscrypt registered
Dec 05 08:18:49 localhost kernel: Key type fscrypt-provisioning registered
Dec 05 08:18:49 localhost kernel: Key type big_key registered
Dec 05 08:18:49 localhost kernel: Key type encrypted registered
Dec 05 08:18:49 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 05 08:18:49 localhost kernel: Loading compiled-in module X.509 certificates
Dec 05 08:18:49 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 05 08:18:49 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 05 08:18:49 localhost kernel: ima: No architecture policies found
Dec 05 08:18:49 localhost kernel: evm: Initialising EVM extended attributes:
Dec 05 08:18:49 localhost kernel: evm: security.selinux
Dec 05 08:18:49 localhost kernel: evm: security.SMACK64 (disabled)
Dec 05 08:18:49 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 05 08:18:49 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 05 08:18:49 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 05 08:18:49 localhost kernel: evm: security.apparmor (disabled)
Dec 05 08:18:49 localhost kernel: evm: security.ima
Dec 05 08:18:49 localhost kernel: evm: security.capability
Dec 05 08:18:49 localhost kernel: evm: HMAC attrs: 0x1
Dec 05 08:18:49 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 05 08:18:49 localhost kernel: Running certificate verification RSA selftest
Dec 05 08:18:49 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 05 08:18:49 localhost kernel: Running certificate verification ECDSA selftest
Dec 05 08:18:49 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 05 08:18:49 localhost kernel: clk: Disabling unused clocks
Dec 05 08:18:49 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 05 08:18:49 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec 05 08:18:49 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 05 08:18:49 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec 05 08:18:49 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 05 08:18:49 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 05 08:18:49 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 05 08:18:49 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 05 08:18:49 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 05 08:18:49 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 05 08:18:49 localhost kernel: Run /init as init process
Dec 05 08:18:49 localhost kernel:   with arguments:
Dec 05 08:18:49 localhost kernel:     /init
Dec 05 08:18:49 localhost kernel:   with environment:
Dec 05 08:18:49 localhost kernel:     HOME=/
Dec 05 08:18:49 localhost kernel:     TERM=linux
Dec 05 08:18:49 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64
Dec 05 08:18:49 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 05 08:18:49 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 05 08:18:49 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 05 08:18:49 localhost systemd[1]: Detected virtualization kvm.
Dec 05 08:18:49 localhost systemd[1]: Detected architecture x86-64.
Dec 05 08:18:49 localhost systemd[1]: Running in initrd.
Dec 05 08:18:49 localhost systemd[1]: No hostname configured, using default hostname.
Dec 05 08:18:49 localhost systemd[1]: Hostname set to <localhost>.
Dec 05 08:18:49 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 05 08:18:49 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 05 08:18:49 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 05 08:18:49 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 05 08:18:49 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 05 08:18:49 localhost systemd[1]: Reached target Local File Systems.
Dec 05 08:18:49 localhost systemd[1]: Reached target Path Units.
Dec 05 08:18:49 localhost systemd[1]: Reached target Slice Units.
Dec 05 08:18:49 localhost systemd[1]: Reached target Swaps.
Dec 05 08:18:49 localhost systemd[1]: Reached target Timer Units.
Dec 05 08:18:49 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 05 08:18:49 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 05 08:18:49 localhost systemd[1]: Listening on Journal Socket.
Dec 05 08:18:49 localhost systemd[1]: Listening on udev Control Socket.
Dec 05 08:18:49 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 05 08:18:49 localhost systemd[1]: Reached target Socket Units.
Dec 05 08:18:49 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 05 08:18:49 localhost systemd[1]: Starting Journal Service...
Dec 05 08:18:49 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 05 08:18:49 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 05 08:18:49 localhost systemd[1]: Starting Create System Users...
Dec 05 08:18:49 localhost systemd[1]: Starting Setup Virtual Console...
Dec 05 08:18:49 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 05 08:18:49 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 05 08:18:49 localhost systemd[1]: Finished Create System Users.
Dec 05 08:18:49 localhost systemd-journald[304]: Journal started
Dec 05 08:18:49 localhost systemd-journald[304]: Runtime Journal (/run/log/journal/dc540e4bdbcd41cc91018457f90414ef) is 8.0M, max 153.6M, 145.6M free.
Dec 05 08:18:49 localhost systemd-sysusers[309]: Creating group 'users' with GID 100.
Dec 05 08:18:49 localhost systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Dec 05 08:18:49 localhost systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 05 08:18:49 localhost systemd[1]: Started Journal Service.
Dec 05 08:18:49 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 05 08:18:49 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 05 08:18:49 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 05 08:18:49 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 05 08:18:49 localhost systemd[1]: Finished Setup Virtual Console.
Dec 05 08:18:49 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 05 08:18:49 localhost systemd[1]: Starting dracut cmdline hook...
Dec 05 08:18:49 localhost dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Dec 05 08:18:49 localhost dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 08:18:49 localhost systemd[1]: Finished dracut cmdline hook.
Dec 05 08:18:49 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 05 08:18:49 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 05 08:18:49 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 05 08:18:49 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 05 08:18:49 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 05 08:18:49 localhost kernel: RPC: Registered udp transport module.
Dec 05 08:18:49 localhost kernel: RPC: Registered tcp transport module.
Dec 05 08:18:49 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 05 08:18:49 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 05 08:18:49 localhost rpc.statd[440]: Version 2.5.4 starting
Dec 05 08:18:49 localhost rpc.statd[440]: Initializing NSM state
Dec 05 08:18:49 localhost rpc.idmapd[445]: Setting log level to 0
Dec 05 08:18:49 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 05 08:18:49 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 05 08:18:49 localhost systemd-udevd[458]: Using default interface naming scheme 'rhel-9.0'.
Dec 05 08:18:49 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 05 08:18:49 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 05 08:18:49 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 05 08:18:49 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 05 08:18:49 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 05 08:18:49 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 08:18:49 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 05 08:18:49 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 05 08:18:49 localhost systemd[1]: Reached target Network.
Dec 05 08:18:49 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 05 08:18:49 localhost systemd[1]: Starting dracut initqueue hook...
Dec 05 08:18:49 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 08:18:49 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 08:18:49 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 05 08:18:49 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 05 08:18:49 localhost kernel: libata version 3.00 loaded.
Dec 05 08:18:49 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 05 08:18:49 localhost kernel:  vda: vda1
Dec 05 08:18:49 localhost kernel: scsi host0: ata_piix
Dec 05 08:18:49 localhost kernel: scsi host1: ata_piix
Dec 05 08:18:49 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 05 08:18:49 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 05 08:18:49 localhost systemd-udevd[473]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 08:18:49 localhost systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 05 08:18:49 localhost systemd[1]: Reached target Initrd Root Device.
Dec 05 08:18:50 localhost kernel: ata1: found unknown device (class 0)
Dec 05 08:18:50 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 05 08:18:50 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 05 08:18:50 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 05 08:18:50 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 05 08:18:50 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 05 08:18:50 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 05 08:18:50 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 05 08:18:50 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 05 08:18:50 localhost systemd[1]: Reached target System Initialization.
Dec 05 08:18:50 localhost systemd[1]: Reached target Basic System.
Dec 05 08:18:50 localhost systemd[1]: Finished dracut initqueue hook.
Dec 05 08:18:50 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 05 08:18:50 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 05 08:18:50 localhost systemd[1]: Reached target Remote File Systems.
Dec 05 08:18:50 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 05 08:18:50 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 05 08:18:50 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec 05 08:18:50 localhost systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Dec 05 08:18:50 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 05 08:18:50 localhost systemd[1]: Mounting /sysroot...
Dec 05 08:18:50 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 05 08:18:50 localhost kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec 05 08:18:50 localhost kernel: XFS (vda1): Ending clean mount
Dec 05 08:18:50 localhost systemd[1]: Mounted /sysroot.
Dec 05 08:18:50 localhost systemd[1]: Reached target Initrd Root File System.
Dec 05 08:18:50 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 05 08:18:50 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 05 08:18:50 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 05 08:18:50 localhost systemd[1]: Reached target Initrd File Systems.
Dec 05 08:18:50 localhost systemd[1]: Reached target Initrd Default Target.
Dec 05 08:18:50 localhost systemd[1]: Starting dracut mount hook...
Dec 05 08:18:50 localhost systemd[1]: Finished dracut mount hook.
Dec 05 08:18:50 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 05 08:18:51 localhost rpc.idmapd[445]: exiting on signal 15
Dec 05 08:18:51 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 05 08:18:51 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 05 08:18:51 localhost systemd[1]: Stopped target Network.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Timer Units.
Dec 05 08:18:51 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 05 08:18:51 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Basic System.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Path Units.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Remote File Systems.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Slice Units.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Socket Units.
Dec 05 08:18:51 localhost systemd[1]: Stopped target System Initialization.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Local File Systems.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Swaps.
Dec 05 08:18:51 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped dracut mount hook.
Dec 05 08:18:51 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 05 08:18:51 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 05 08:18:51 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 05 08:18:51 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 05 08:18:51 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 05 08:18:51 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 05 08:18:51 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 05 08:18:51 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 05 08:18:51 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 05 08:18:51 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 05 08:18:51 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 05 08:18:51 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Closed udev Control Socket.
Dec 05 08:18:51 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Closed udev Kernel Socket.
Dec 05 08:18:51 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 05 08:18:51 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 05 08:18:51 localhost systemd[1]: Starting Cleanup udev Database...
Dec 05 08:18:51 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 05 08:18:51 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 05 08:18:51 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Create System Users.
Dec 05 08:18:51 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Finished Cleanup udev Database.
Dec 05 08:18:51 localhost systemd[1]: Reached target Switch Root.
Dec 05 08:18:51 localhost systemd[1]: Starting Switch Root...
Dec 05 08:18:51 localhost systemd[1]: Switching root.
Dec 05 08:18:51 localhost systemd-journald[304]: Journal stopped
Dec 05 08:18:51 localhost systemd-journald[304]: Received SIGTERM from PID 1 (systemd).
Dec 05 08:18:51 localhost kernel: audit: type=1404 audit(1764922731.252:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 05 08:18:51 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 08:18:51 localhost kernel: SELinux:  policy capability open_perms=1
Dec 05 08:18:51 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 08:18:51 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 05 08:18:51 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 08:18:51 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 08:18:51 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 08:18:51 localhost kernel: audit: type=1403 audit(1764922731.378:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 05 08:18:51 localhost systemd[1]: Successfully loaded SELinux policy in 130.114ms.
Dec 05 08:18:51 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.206ms.
Dec 05 08:18:51 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 05 08:18:51 localhost systemd[1]: Detected virtualization kvm.
Dec 05 08:18:51 localhost systemd[1]: Detected architecture x86-64.
Dec 05 08:18:51 localhost systemd-rc-local-generator[636]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:18:51 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped Switch Root.
Dec 05 08:18:51 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 05 08:18:51 localhost systemd[1]: Created slice Slice /system/getty.
Dec 05 08:18:51 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 05 08:18:51 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 05 08:18:51 localhost systemd[1]: Created slice User and Session Slice.
Dec 05 08:18:51 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 05 08:18:51 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 05 08:18:51 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 05 08:18:51 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Switch Root.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 05 08:18:51 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 05 08:18:51 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 05 08:18:51 localhost systemd[1]: Reached target Path Units.
Dec 05 08:18:51 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 05 08:18:51 localhost systemd[1]: Reached target Slice Units.
Dec 05 08:18:51 localhost systemd[1]: Reached target Swaps.
Dec 05 08:18:51 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 05 08:18:51 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 05 08:18:51 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 05 08:18:51 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 05 08:18:51 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 05 08:18:51 localhost systemd[1]: Listening on udev Control Socket.
Dec 05 08:18:51 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 05 08:18:51 localhost systemd[1]: Mounting Huge Pages File System...
Dec 05 08:18:51 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 05 08:18:51 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 05 08:18:51 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 05 08:18:51 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 05 08:18:51 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 05 08:18:51 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 08:18:51 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 05 08:18:51 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 05 08:18:51 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 05 08:18:51 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 05 08:18:51 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 05 08:18:51 localhost systemd[1]: Stopped Journal Service.
Dec 05 08:18:51 localhost systemd[1]: Starting Journal Service...
Dec 05 08:18:51 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 05 08:18:51 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 05 08:18:51 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 08:18:51 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 05 08:18:51 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 05 08:18:51 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 05 08:18:51 localhost kernel: fuse: init (API version 7.37)
Dec 05 08:18:51 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 05 08:18:51 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 05 08:18:51 localhost systemd-journald[678]: Journal started
Dec 05 08:18:51 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 05 08:18:51 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 05 08:18:51 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Mounted Huge Pages File System.
Dec 05 08:18:51 localhost systemd[1]: Started Journal Service.
Dec 05 08:18:51 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 05 08:18:51 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 05 08:18:51 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 05 08:18:51 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 05 08:18:51 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 08:18:51 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 05 08:18:51 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 05 08:18:51 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 05 08:18:51 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 05 08:18:51 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 05 08:18:51 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 05 08:18:51 localhost kernel: ACPI: bus type drm_connector registered
Dec 05 08:18:51 localhost systemd[1]: Mounting FUSE Control File System...
Dec 05 08:18:51 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 05 08:18:51 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 05 08:18:51 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 05 08:18:51 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 05 08:18:51 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 05 08:18:51 localhost systemd[1]: Starting Create System Users...
Dec 05 08:18:51 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 05 08:18:51 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 05 08:18:51 localhost systemd-journald[678]: Received client request to flush runtime journal.
Dec 05 08:18:51 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 05 08:18:51 localhost systemd[1]: Mounted FUSE Control File System.
Dec 05 08:18:51 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 05 08:18:51 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 05 08:18:51 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 05 08:18:51 localhost systemd[1]: Finished Create System Users.
Dec 05 08:18:51 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 05 08:18:52 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 05 08:18:52 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 05 08:18:52 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 05 08:18:52 localhost systemd[1]: Reached target Local File Systems.
Dec 05 08:18:52 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 05 08:18:52 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 05 08:18:52 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 05 08:18:52 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 05 08:18:52 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 05 08:18:52 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 05 08:18:52 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 05 08:18:52 localhost bootctl[699]: Couldn't find EFI system partition, skipping.
Dec 05 08:18:52 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 05 08:18:52 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 05 08:18:52 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 05 08:18:52 localhost systemd[1]: Starting Security Auditing Service...
Dec 05 08:18:52 localhost systemd[1]: Starting RPC Bind...
Dec 05 08:18:52 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 05 08:18:52 localhost auditd[705]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 05 08:18:52 localhost auditd[705]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 05 08:18:52 localhost systemd[1]: Started RPC Bind.
Dec 05 08:18:52 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 05 08:18:52 localhost augenrules[710]: /sbin/augenrules: No change
Dec 05 08:18:52 localhost augenrules[725]: No rules
Dec 05 08:18:52 localhost augenrules[725]: enabled 1
Dec 05 08:18:52 localhost augenrules[725]: failure 1
Dec 05 08:18:52 localhost augenrules[725]: pid 705
Dec 05 08:18:52 localhost augenrules[725]: rate_limit 0
Dec 05 08:18:52 localhost augenrules[725]: backlog_limit 8192
Dec 05 08:18:52 localhost augenrules[725]: lost 0
Dec 05 08:18:52 localhost augenrules[725]: backlog 0
Dec 05 08:18:52 localhost augenrules[725]: backlog_wait_time 60000
Dec 05 08:18:52 localhost augenrules[725]: backlog_wait_time_actual 0
Dec 05 08:18:52 localhost systemd[1]: Started Security Auditing Service.
Dec 05 08:18:52 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 05 08:18:52 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 05 08:18:52 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 05 08:18:52 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 05 08:18:52 localhost systemd[1]: Starting Update is Completed...
Dec 05 08:18:52 localhost systemd[1]: Finished Update is Completed.
Dec 05 08:18:52 localhost systemd-udevd[733]: Using default interface naming scheme 'rhel-9.0'.
Dec 05 08:18:52 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 05 08:18:52 localhost systemd[1]: Reached target System Initialization.
Dec 05 08:18:52 localhost systemd[1]: Started dnf makecache --timer.
Dec 05 08:18:52 localhost systemd[1]: Started Daily rotation of log files.
Dec 05 08:18:52 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 05 08:18:52 localhost systemd[1]: Reached target Timer Units.
Dec 05 08:18:52 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 05 08:18:52 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 05 08:18:52 localhost systemd[1]: Reached target Socket Units.
Dec 05 08:18:52 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 05 08:18:52 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 08:18:52 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 05 08:18:52 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 08:18:52 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 08:18:52 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 08:18:52 localhost systemd-udevd[742]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 08:18:52 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 05 08:18:52 localhost systemd[1]: Reached target Basic System.
Dec 05 08:18:52 localhost dbus-broker-lau[767]: Ready
Dec 05 08:18:52 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 05 08:18:52 localhost systemd[1]: Starting NTP client/server...
Dec 05 08:18:52 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 05 08:18:52 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 05 08:18:52 localhost chronyd[781]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 05 08:18:52 localhost chronyd[781]: Loaded 0 symmetric keys
Dec 05 08:18:52 localhost chronyd[781]: Using right/UTC timezone to obtain leap second data
Dec 05 08:18:52 localhost chronyd[781]: Loaded seccomp filter (level 2)
Dec 05 08:18:52 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 05 08:18:52 localhost systemd[1]: Started irqbalance daemon.
Dec 05 08:18:52 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 05 08:18:52 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 05 08:18:52 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 05 08:18:52 localhost kernel: kvm_amd: TSC scaling supported
Dec 05 08:18:52 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 05 08:18:52 localhost kernel: kvm_amd: Nested Paging enabled
Dec 05 08:18:52 localhost kernel: kvm_amd: LBR virtualization supported
Dec 05 08:18:52 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 05 08:18:52 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 05 08:18:52 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 08:18:52 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 08:18:52 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 08:18:52 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 05 08:18:52 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 05 08:18:52 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 05 08:18:52 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 05 08:18:52 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 05 08:18:52 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 05 08:18:52 localhost kernel: Console: switching to colour dummy device 80x25
Dec 05 08:18:52 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 05 08:18:52 localhost kernel: [drm] features: -context_init
Dec 05 08:18:52 localhost kernel: [drm] number of scanouts: 1
Dec 05 08:18:52 localhost kernel: [drm] number of cap sets: 0
Dec 05 08:18:52 localhost systemd[1]: Starting User Login Management...
Dec 05 08:18:53 localhost systemd[1]: Started NTP client/server.
Dec 05 08:18:53 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 05 08:18:53 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 05 08:18:53 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 05 08:18:53 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 05 08:18:53 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 05 08:18:53 localhost systemd-logind[807]: New seat seat0.
Dec 05 08:18:53 localhost systemd-logind[807]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 05 08:18:53 localhost systemd-logind[807]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 05 08:18:53 localhost systemd[1]: Started User Login Management.
Dec 05 08:18:53 localhost iptables.init[783]: iptables: Applying firewall rules: [  OK  ]
Dec 05 08:18:53 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 05 08:18:53 localhost cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 05 Dec 2025 08:18:53 +0000. Up 6.27 seconds.
Dec 05 08:18:53 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 05 08:18:53 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 05 08:18:53 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpf9_9210i.mount: Deactivated successfully.
Dec 05 08:18:53 localhost systemd[1]: Starting Hostname Service...
Dec 05 08:18:53 localhost systemd[1]: Started Hostname Service.
Dec 05 08:18:53 np0005546565.novalocal systemd-hostnamed[856]: Hostname set to <np0005546565.novalocal> (static)
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Reached target Preparation for Network.
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Starting Network Manager...
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7068] NetworkManager (version 1.54.1-1.el9) is starting... (boot:37103174-0a80-476d-aa28-333d5ef7214b)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7072] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7130] manager[0x56291aeb1080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7163] hostname: hostname: using hostnamed
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7163] hostname: static hostname changed from (none) to "np0005546565.novalocal"
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7167] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7255] manager[0x56291aeb1080]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7255] manager[0x56291aeb1080]: rfkill: WWAN hardware radio set enabled
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7294] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7296] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7297] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7297] manager: Networking is enabled by state file
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7299] settings: Loaded settings plugin: keyfile (internal)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7310] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7328] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7340] dhcp: init: Using DHCP client 'internal'
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7342] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7354] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7361] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7368] device (lo): Activation: starting connection 'lo' (2b3ccb97-e960-48b1-9417-7b23d43663c4)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7377] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7380] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7405] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7408] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7411] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7412] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7414] device (eth0): carrier: link connected
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7417] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7423] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7428] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7432] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7433] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7435] manager: NetworkManager state is now CONNECTING
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7436] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7442] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7445] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Started Network Manager.
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Reached target Network.
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7707] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7709] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 08:18:53 np0005546565.novalocal NetworkManager[860]: <info>  [1764922733.7714] device (lo): Activation: successful, device activated.
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Reached target NFS client services.
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: Reached target Remote File Systems.
Dec 05 08:18:53 np0005546565.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6321] dhcp4 (eth0): state changed new lease, address=38.102.83.154
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6335] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6352] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6392] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6394] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6397] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6402] device (eth0): Activation: successful, device activated.
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6406] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 08:18:57 np0005546565.novalocal NetworkManager[860]: <info>  [1764922737.6409] manager: startup complete
Dec 05 08:18:57 np0005546565.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 05 08:18:57 np0005546565.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 05 08:18:57 np0005546565.novalocal cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 05 Dec 2025 08:18:57 +0000. Up 11.02 seconds.
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.154         | 255.255.255.0 | global | fa:16:3e:94:54:b0 |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fe94:54b0/64 |       .       |  link  | fa:16:3e:94:54:b0 |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Dec 05 08:18:58 np0005546565.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 05 08:18:58 np0005546565.novalocal useradd[990]: new group: name=cloud-user, GID=1001
Dec 05 08:18:58 np0005546565.novalocal useradd[990]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 05 08:18:58 np0005546565.novalocal useradd[990]: add 'cloud-user' to group 'adm'
Dec 05 08:18:58 np0005546565.novalocal useradd[990]: add 'cloud-user' to group 'systemd-journal'
Dec 05 08:18:58 np0005546565.novalocal useradd[990]: add 'cloud-user' to shadow group 'adm'
Dec 05 08:18:58 np0005546565.novalocal useradd[990]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Generating public/private rsa key pair.
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: The key fingerprint is:
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: SHA256:WCxhE/OECtOXtbGpTPm2FkKSwPItrR+8oGyJulg9RjE root@np0005546565.novalocal
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: The key's randomart image is:
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: +---[RSA 3072]----+
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: | ...  *=+        |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |. +..o+O =       |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: | o =Eo= B        |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |  o +B *         |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |   +. = S        |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |  ooo  o o       |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |o.oo+o  o        |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |+= .o. .         |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |*.               |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Generating public/private ecdsa key pair.
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: The key fingerprint is:
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: SHA256:3XWjxM03qXUJXA98tgIW2a6rSE7cTq3OS0KQbuzydt8 root@np0005546565.novalocal
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: The key's randomart image is:
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: +---[ECDSA 256]---+
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |           .=oo. |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |      .    +.o=.*|
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |     o    . oo.@*|
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |    o .  . ..++o=|
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |     + .S . oo.  |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |    o o . ..     |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |   . . = + ..    |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |    o.+.B o.     |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |    ...ooOoE     |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Generating public/private ed25519 key pair.
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: The key fingerprint is:
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: SHA256:BFGDs8sNJmYWMXTOIG/W4l9wL/6e+bGn7woETjz7hFw root@np0005546565.novalocal
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: The key's randomart image is:
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: +--[ED25519 256]--+
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |  ..=.++o        |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |   o.Boo .       |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |    =.=oB E      |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |   +=.+B B       |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |   +.+ +S +      |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |     .oo.=       |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |      . . o .    |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |         . + o.  |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: |         .=.=*o  |
Dec 05 08:18:59 np0005546565.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Reached target Network is Online.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Starting System Logging Service...
Dec 05 08:18:59 np0005546565.novalocal sm-notify[1006]: Version 2.5.4 starting
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Starting Permit User Sessions...
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Finished Permit User Sessions.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Started Command Scheduler.
Dec 05 08:18:59 np0005546565.novalocal sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 05 08:18:59 np0005546565.novalocal sshd[1008]: Server listening on :: port 22.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Started Getty on tty1.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 05 08:18:59 np0005546565.novalocal crond[1011]: (CRON) STARTUP (1.5.7)
Dec 05 08:18:59 np0005546565.novalocal crond[1011]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Reached target Login Prompts.
Dec 05 08:18:59 np0005546565.novalocal crond[1011]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 43% if used.)
Dec 05 08:18:59 np0005546565.novalocal crond[1011]: (CRON) INFO (running with inotify support)
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 05 08:18:59 np0005546565.novalocal rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Started System Logging Service.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Reached target Multi-User System.
Dec 05 08:18:59 np0005546565.novalocal rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1021]: Unable to negotiate with 38.102.83.114 port 52258: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 05 08:18:59 np0005546565.novalocal rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1039]: Unable to negotiate with 38.102.83.114 port 52280: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1051]: Unable to negotiate with 38.102.83.114 port 52286: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1013]: Connection closed by 38.102.83.114 port 52254 [preauth]
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1079]: Unable to negotiate with 38.102.83.114 port 52314: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1083]: Unable to negotiate with 38.102.83.114 port 52326: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 05 08:18:59 np0005546565.novalocal kdumpctl[1020]: kdump: No kdump initial ramdisk found.
Dec 05 08:18:59 np0005546565.novalocal kdumpctl[1020]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1030]: Connection closed by 38.102.83.114 port 52266 [preauth]
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1059]: Connection closed by 38.102.83.114 port 52290 [preauth]
Dec 05 08:18:59 np0005546565.novalocal sshd-session[1074]: Connection closed by 38.102.83.114 port 52298 [preauth]
Dec 05 08:18:59 np0005546565.novalocal cloud-init[1167]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 05 Dec 2025 08:18:59 +0000. Up 12.62 seconds.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 05 08:18:59 np0005546565.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 05 08:18:59 np0005546565.novalocal dracut[1285]: dracut-057-102.git20250818.el9
Dec 05 08:18:59 np0005546565.novalocal cloud-init[1306]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 05 Dec 2025 08:18:59 +0000. Up 13.01 seconds.
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec 05 08:19:00 np0005546565.novalocal cloud-init[1332]: #############################################################
Dec 05 08:19:00 np0005546565.novalocal cloud-init[1334]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 05 08:19:00 np0005546565.novalocal cloud-init[1341]: 256 SHA256:3XWjxM03qXUJXA98tgIW2a6rSE7cTq3OS0KQbuzydt8 root@np0005546565.novalocal (ECDSA)
Dec 05 08:19:00 np0005546565.novalocal cloud-init[1349]: 256 SHA256:BFGDs8sNJmYWMXTOIG/W4l9wL/6e+bGn7woETjz7hFw root@np0005546565.novalocal (ED25519)
Dec 05 08:19:00 np0005546565.novalocal cloud-init[1357]: 3072 SHA256:WCxhE/OECtOXtbGpTPm2FkKSwPItrR+8oGyJulg9RjE root@np0005546565.novalocal (RSA)
Dec 05 08:19:00 np0005546565.novalocal cloud-init[1358]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 05 08:19:00 np0005546565.novalocal cloud-init[1362]: #############################################################
Dec 05 08:19:00 np0005546565.novalocal cloud-init[1306]: Cloud-init v. 24.4-7.el9 finished at Fri, 05 Dec 2025 08:19:00 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.18 seconds
Dec 05 08:19:00 np0005546565.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 05 08:19:00 np0005546565.novalocal systemd[1]: Reached target Cloud-init target.
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 05 08:19:00 np0005546565.novalocal dracut[1287]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: memstrack is not available
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: memstrack is not available
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: *** Including module: systemd ***
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: *** Including module: fips ***
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: *** Including module: systemd-initrd ***
Dec 05 08:19:01 np0005546565.novalocal dracut[1287]: *** Including module: i18n ***
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]: *** Including module: drm ***
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]: *** Including module: prefixdevname ***
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]: *** Including module: kernel-modules ***
Dec 05 08:19:02 np0005546565.novalocal chronyd[781]: Selected source 216.128.178.20 (2.centos.pool.ntp.org)
Dec 05 08:19:02 np0005546565.novalocal chronyd[781]: System clock TAI offset set to 37 seconds
Dec 05 08:19:02 np0005546565.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]: *** Including module: kernel-modules-extra ***
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]: *** Including module: qemu ***
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: IRQ 25 affinity is now unmanaged
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: IRQ 31 affinity is now unmanaged
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: IRQ 28 affinity is now unmanaged
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: IRQ 32 affinity is now unmanaged
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: IRQ 30 affinity is now unmanaged
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 05 08:19:02 np0005546565.novalocal irqbalance[793]: IRQ 29 affinity is now unmanaged
Dec 05 08:19:02 np0005546565.novalocal dracut[1287]: *** Including module: fstab-sys ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: rootfs-block ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: terminfo ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: udev-rules ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: Skipping udev rule: 91-permissions.rules
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: virtiofs ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: dracut-systemd ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: usrmount ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: base ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: fs-lib ***
Dec 05 08:19:03 np0005546565.novalocal dracut[1287]: *** Including module: kdumpbase ***
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:   microcode_ctl module: mangling fw_dir
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 05 08:19:04 np0005546565.novalocal chronyd[781]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]: *** Including module: openssl ***
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]: *** Including module: shutdown ***
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]: *** Including module: squash ***
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]: *** Including modules done ***
Dec 05 08:19:04 np0005546565.novalocal dracut[1287]: *** Installing kernel module dependencies ***
Dec 05 08:19:05 np0005546565.novalocal dracut[1287]: *** Installing kernel module dependencies done ***
Dec 05 08:19:05 np0005546565.novalocal dracut[1287]: *** Resolving executable dependencies ***
Dec 05 08:19:06 np0005546565.novalocal dracut[1287]: *** Resolving executable dependencies done ***
Dec 05 08:19:06 np0005546565.novalocal dracut[1287]: *** Generating early-microcode cpio image ***
Dec 05 08:19:06 np0005546565.novalocal dracut[1287]: *** Store current command line parameters ***
Dec 05 08:19:06 np0005546565.novalocal dracut[1287]: Stored kernel commandline:
Dec 05 08:19:06 np0005546565.novalocal dracut[1287]: No dracut internal kernel commandline stored in the initramfs
Dec 05 08:19:06 np0005546565.novalocal dracut[1287]: *** Install squash loader ***
Dec 05 08:19:07 np0005546565.novalocal dracut[1287]: *** Squashing the files inside the initramfs ***
Dec 05 08:19:07 np0005546565.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: *** Squashing the files inside the initramfs done ***
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: *** Hardlinking files ***
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: Mode:           real
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: Files:          50
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: Linked:         0 files
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: Compared:       0 xattrs
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: Compared:       0 files
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: Saved:          0 B
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: Duration:       0.000801 seconds
Dec 05 08:19:08 np0005546565.novalocal dracut[1287]: *** Hardlinking files done ***
Dec 05 08:19:09 np0005546565.novalocal dracut[1287]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec 05 08:19:09 np0005546565.novalocal kdumpctl[1020]: kdump: kexec: loaded kdump kernel
Dec 05 08:19:09 np0005546565.novalocal kdumpctl[1020]: kdump: Starting kdump: [OK]
Dec 05 08:19:09 np0005546565.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 05 08:19:09 np0005546565.novalocal systemd[1]: Startup finished in 1.947s (kernel) + 2.361s (initrd) + 18.399s (userspace) = 22.708s.
Dec 05 08:19:17 np0005546565.novalocal sshd-session[4296]: Accepted publickey for zuul from 38.102.83.114 port 44646 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 05 08:19:17 np0005546565.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 05 08:19:17 np0005546565.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 05 08:19:17 np0005546565.novalocal systemd-logind[807]: New session 1 of user zuul.
Dec 05 08:19:17 np0005546565.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 05 08:19:17 np0005546565.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Queued start job for default target Main User Target.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Created slice User Application Slice.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Reached target Paths.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Reached target Timers.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Starting D-Bus User Message Bus Socket...
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Starting Create User's Volatile Files and Directories...
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Finished Create User's Volatile Files and Directories.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Listening on D-Bus User Message Bus Socket.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Reached target Sockets.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Reached target Basic System.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Reached target Main User Target.
Dec 05 08:19:17 np0005546565.novalocal systemd[4300]: Startup finished in 176ms.
Dec 05 08:19:17 np0005546565.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 05 08:19:17 np0005546565.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 05 08:19:18 np0005546565.novalocal sshd-session[4296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:19:18 np0005546565.novalocal python3[4382]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:19:21 np0005546565.novalocal python3[4410]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:19:23 np0005546565.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 08:19:30 np0005546565.novalocal python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:19:31 np0005546565.novalocal python3[4510]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 05 08:19:33 np0005546565.novalocal python3[4536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQrOx61GLF0o/CB4cyAk83PinyJUn0X496NWLVlD1IQGhmkzM1ukkL6K26cLXP8KYqj+i/+wUd40/hSHestjPjmmg0S9vmwIkEHMhV8F2NcPp0GMzOglWHs1J2bf0vn2+WaK2eUCYSuk4sT4+JiqaHhdt2zOkX91K+eo2nQJBRTa+dA7412ijE9S/NB1HRT8Qy9o7Hg7VWs1jRum+hTqJ0+ThYz1RmIabLKq66wMWQ5nDggnm4JaqfAVCZhrqrhsbUhOlZOeQHWMez5pW1SsRdLDa5S8/IoGL7+7qZkMhf86PNONOHxBkCzGfwikO/itW+cGZaFi6CDMrJ3gRvdW66KnaLc6U9zzPnnLZ2BvGnYNLODFgO1O2v5/sejfBGA0Z4KFKLbkwYJq8WO80rkVbSTWlKOj8sYRpG/Ie1QWC6uQ+x8qPWGZvRFUPPlvRwtGmnPqC1Z/blAzhYQLOLD6/z9XqVnC7YiOX0v1Y+e150u0foqLjHIrNrUpjDr+PVaeE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:33 np0005546565.novalocal python3[4560]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:34 np0005546565.novalocal python3[4659]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:19:34 np0005546565.novalocal python3[4730]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764922773.8454742-252-131800264912704/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=4909e2ae87444cb387b45bb5e51019bb_id_rsa follow=False checksum=29847eaf364f99c6c2ff09f0534d88222813a6a3 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:35 np0005546565.novalocal python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:19:35 np0005546565.novalocal python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764922774.884141-307-136320961709030/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=4909e2ae87444cb387b45bb5e51019bb_id_rsa.pub follow=False checksum=166a31c478fe95f3828c748b2f05ab7a18ae71dc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:36 np0005546565.novalocal python3[4972]: ansible-ping Invoked with data=pong
Dec 05 08:19:37 np0005546565.novalocal python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:19:39 np0005546565.novalocal python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 05 08:19:41 np0005546565.novalocal python3[5086]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:41 np0005546565.novalocal python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:41 np0005546565.novalocal python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:42 np0005546565.novalocal python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:42 np0005546565.novalocal python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:42 np0005546565.novalocal python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:44 np0005546565.novalocal sudo[5230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqyepnbicocipltcwwsktvsuagoariqt ; /usr/bin/python3'
Dec 05 08:19:44 np0005546565.novalocal sudo[5230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:19:44 np0005546565.novalocal python3[5232]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:44 np0005546565.novalocal sudo[5230]: pam_unix(sudo:session): session closed for user root
Dec 05 08:19:44 np0005546565.novalocal sudo[5308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlhorwfqjuzqoganamgxrvsnxuvkajis ; /usr/bin/python3'
Dec 05 08:19:44 np0005546565.novalocal sudo[5308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:19:45 np0005546565.novalocal python3[5310]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:19:45 np0005546565.novalocal sudo[5308]: pam_unix(sudo:session): session closed for user root
Dec 05 08:19:45 np0005546565.novalocal sudo[5381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atpkottmskmluxgndckydzmnjagwmupk ; /usr/bin/python3'
Dec 05 08:19:45 np0005546565.novalocal sudo[5381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:19:45 np0005546565.novalocal python3[5383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764922784.6442223-33-191537121349716/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:45 np0005546565.novalocal sudo[5381]: pam_unix(sudo:session): session closed for user root
Dec 05 08:19:46 np0005546565.novalocal python3[5431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:46 np0005546565.novalocal python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:46 np0005546565.novalocal python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:47 np0005546565.novalocal python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:47 np0005546565.novalocal python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:47 np0005546565.novalocal python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:47 np0005546565.novalocal python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:48 np0005546565.novalocal python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:48 np0005546565.novalocal python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:48 np0005546565.novalocal python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:49 np0005546565.novalocal python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:49 np0005546565.novalocal python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:49 np0005546565.novalocal python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:49 np0005546565.novalocal python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:50 np0005546565.novalocal python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:50 np0005546565.novalocal python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:50 np0005546565.novalocal python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:50 np0005546565.novalocal python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:51 np0005546565.novalocal python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:51 np0005546565.novalocal python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:51 np0005546565.novalocal python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:52 np0005546565.novalocal python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:52 np0005546565.novalocal python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:52 np0005546565.novalocal python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:52 np0005546565.novalocal python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:53 np0005546565.novalocal python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:19:55 np0005546565.novalocal sudo[6055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rthecbeagknpehxdylarjkadisowioqs ; /usr/bin/python3'
Dec 05 08:19:55 np0005546565.novalocal sudo[6055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:19:55 np0005546565.novalocal python3[6057]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 05 08:19:55 np0005546565.novalocal systemd[1]: Starting Time & Date Service...
Dec 05 08:19:56 np0005546565.novalocal systemd[1]: Started Time & Date Service.
Dec 05 08:19:56 np0005546565.novalocal systemd-timedated[6059]: Changed time zone to 'UTC' (UTC).
Dec 05 08:19:56 np0005546565.novalocal sudo[6055]: pam_unix(sudo:session): session closed for user root
Dec 05 08:19:57 np0005546565.novalocal sudo[6086]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khauahhsngnsuktocbrepeevunkyyuaj ; /usr/bin/python3'
Dec 05 08:19:57 np0005546565.novalocal sudo[6086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:19:57 np0005546565.novalocal python3[6088]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:57 np0005546565.novalocal sudo[6086]: pam_unix(sudo:session): session closed for user root
Dec 05 08:19:57 np0005546565.novalocal python3[6164]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:19:58 np0005546565.novalocal python3[6235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764922797.704123-252-216722578072067/source _original_basename=tmp5er6xx87 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:58 np0005546565.novalocal python3[6335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:19:59 np0005546565.novalocal python3[6406]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764922798.593022-302-194751269741560/source _original_basename=tmpyvquz4ih follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:19:59 np0005546565.novalocal sudo[6506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puokomyenptvlrlmxngmdzvgbzwebhyg ; /usr/bin/python3'
Dec 05 08:19:59 np0005546565.novalocal sudo[6506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:20:00 np0005546565.novalocal python3[6508]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:20:00 np0005546565.novalocal sudo[6506]: pam_unix(sudo:session): session closed for user root
Dec 05 08:20:00 np0005546565.novalocal sudo[6579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwrewbsxqnwpawdveqyzoklpsdulehph ; /usr/bin/python3'
Dec 05 08:20:00 np0005546565.novalocal sudo[6579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:20:00 np0005546565.novalocal python3[6581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764922799.7464786-382-24331043427714/source _original_basename=tmp54arp5f_ follow=False checksum=7869a9f58d9df3124603f12d5ce0d9f14ed8cf76 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:20:00 np0005546565.novalocal sudo[6579]: pam_unix(sudo:session): session closed for user root
Dec 05 08:20:00 np0005546565.novalocal python3[6629]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:20:01 np0005546565.novalocal python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:20:01 np0005546565.novalocal sudo[6733]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmvlgyngdzkuabboocqksmnbammmiqvr ; /usr/bin/python3'
Dec 05 08:20:01 np0005546565.novalocal sudo[6733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:20:01 np0005546565.novalocal python3[6735]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:20:01 np0005546565.novalocal sudo[6733]: pam_unix(sudo:session): session closed for user root
Dec 05 08:20:01 np0005546565.novalocal sudo[6806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wijeopmbupmiwxczhbydrgcjvplljjbg ; /usr/bin/python3'
Dec 05 08:20:01 np0005546565.novalocal sudo[6806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:20:01 np0005546565.novalocal python3[6808]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764922801.403022-452-151671038450844/source _original_basename=tmpsk7e2qbg follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:20:02 np0005546565.novalocal sudo[6806]: pam_unix(sudo:session): session closed for user root
Dec 05 08:20:02 np0005546565.novalocal sudo[6857]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwmgqmzpokcpwvncdwwriammmctzqlmk ; /usr/bin/python3'
Dec 05 08:20:02 np0005546565.novalocal sudo[6857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:20:02 np0005546565.novalocal python3[6859]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-edfb-00da-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:20:02 np0005546565.novalocal sudo[6857]: pam_unix(sudo:session): session closed for user root
Dec 05 08:20:03 np0005546565.novalocal python3[6887]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-edfb-00da-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 05 08:20:04 np0005546565.novalocal python3[6915]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:20:24 np0005546565.novalocal sudo[6939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpurthosnplayrvjdyqfadxegonrafph ; /usr/bin/python3'
Dec 05 08:20:24 np0005546565.novalocal sudo[6939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:20:25 np0005546565.novalocal python3[6941]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:20:25 np0005546565.novalocal sudo[6939]: pam_unix(sudo:session): session closed for user root
Dec 05 08:20:26 np0005546565.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 05 08:21:25 np0005546565.novalocal sshd-session[4309]: Received disconnect from 38.102.83.114 port 44646:11: disconnected by user
Dec 05 08:21:25 np0005546565.novalocal sshd-session[4309]: Disconnected from user zuul 38.102.83.114 port 44646
Dec 05 08:21:25 np0005546565.novalocal sshd-session[4296]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:21:25 np0005546565.novalocal systemd-logind[807]: Session 1 logged out. Waiting for processes to exit.
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 05 08:21:27 np0005546565.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 05 08:21:27 np0005546565.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.3833] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 08:21:27 np0005546565.novalocal systemd-udevd[6944]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4043] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4077] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4080] device (eth1): carrier: link connected
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4081] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4086] policy: auto-activating connection 'Wired connection 1' (b6c4a16e-f9a7-3917-b1a2-dbfcd5a6a2e4)
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4089] device (eth1): Activation: starting connection 'Wired connection 1' (b6c4a16e-f9a7-3917-b1a2-dbfcd5a6a2e4)
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4090] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4092] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4095] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:21:27 np0005546565.novalocal NetworkManager[860]: <info>  [1764922887.4099] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:21:27 np0005546565.novalocal systemd[4300]: Starting Mark boot as successful...
Dec 05 08:21:27 np0005546565.novalocal systemd[4300]: Finished Mark boot as successful.
Dec 05 08:21:28 np0005546565.novalocal sshd-session[6949]: Accepted publickey for zuul from 38.102.83.114 port 58430 ssh2: RSA SHA256:7++2dNw2PJkt7ZbXTGnBEG3OOx6HMlhXSSRVYWpGqmQ
Dec 05 08:21:28 np0005546565.novalocal systemd-logind[807]: New session 3 of user zuul.
Dec 05 08:21:28 np0005546565.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 05 08:21:28 np0005546565.novalocal sshd-session[6949]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:21:28 np0005546565.novalocal python3[6976]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-f9fe-4324-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:21:35 np0005546565.novalocal sudo[7054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvduypkphdhfvlkktgpssehtyitfafba ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 08:21:35 np0005546565.novalocal sudo[7054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:21:35 np0005546565.novalocal python3[7056]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:21:35 np0005546565.novalocal sudo[7054]: pam_unix(sudo:session): session closed for user root
Dec 05 08:21:35 np0005546565.novalocal sudo[7127]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaicrvdpdqiipdcfjsepigfdlgtnbfgq ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 08:21:35 np0005546565.novalocal sudo[7127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:21:36 np0005546565.novalocal python3[7129]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764922895.4157531-155-220659833836560/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=bdff7176a3cbce76837ea7b060f07f8ed7e48c0d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:21:36 np0005546565.novalocal sudo[7127]: pam_unix(sudo:session): session closed for user root
Dec 05 08:21:36 np0005546565.novalocal sudo[7177]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdjeodxdiqmqdxavunftdpsvptgbpcpt ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 08:21:36 np0005546565.novalocal sudo[7177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:21:36 np0005546565.novalocal python3[7179]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Stopping Network Manager...
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[860]: <info>  [1764922896.6506] caught SIGTERM, shutting down normally.
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[860]: <info>  [1764922896.6523] dhcp4 (eth0): canceled DHCP transaction
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[860]: <info>  [1764922896.6524] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[860]: <info>  [1764922896.6524] dhcp4 (eth0): state changed no lease
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[860]: <info>  [1764922896.6527] manager: NetworkManager state is now CONNECTING
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[860]: <info>  [1764922896.6666] dhcp4 (eth1): canceled DHCP transaction
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[860]: <info>  [1764922896.6667] dhcp4 (eth1): state changed no lease
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[860]: <info>  [1764922896.6794] exiting (success)
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Stopped Network Manager.
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: NetworkManager.service: Consumed 1.155s CPU time, 9.9M memory peak.
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Starting Network Manager...
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.7274] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:37103174-0a80-476d-aa28-333d5ef7214b)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.7275] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.7326] manager[0x5641950fb070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Starting Hostname Service...
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Started Hostname Service.
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8126] hostname: hostname: using hostnamed
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8127] hostname: static hostname changed from (none) to "np0005546565.novalocal"
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8133] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8139] manager[0x5641950fb070]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8140] manager[0x5641950fb070]: rfkill: WWAN hardware radio set enabled
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8186] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8187] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8188] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8188] manager: Networking is enabled by state file
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8191] settings: Loaded settings plugin: keyfile (internal)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8197] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8241] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8256] dhcp: init: Using DHCP client 'internal'
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8261] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8270] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8280] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8294] device (lo): Activation: starting connection 'lo' (2b3ccb97-e960-48b1-9417-7b23d43663c4)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8306] device (eth0): carrier: link connected
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8313] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8321] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8322] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8334] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8346] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8356] device (eth1): carrier: link connected
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8363] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8372] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (b6c4a16e-f9a7-3917-b1a2-dbfcd5a6a2e4) (indicated)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8372] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8382] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8393] device (eth1): Activation: starting connection 'Wired connection 1' (b6c4a16e-f9a7-3917-b1a2-dbfcd5a6a2e4)
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Started Network Manager.
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8403] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8414] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8419] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8422] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8427] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8432] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8437] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8441] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8447] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8459] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8463] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8480] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8489] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:21:36 np0005546565.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8508] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8514] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8519] device (lo): Activation: successful, device activated.
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8528] dhcp4 (eth0): state changed new lease, address=38.102.83.154
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8537] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8676] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8712] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8713] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8717] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8720] device (eth0): Activation: successful, device activated.
Dec 05 08:21:36 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922896.8724] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 08:21:36 np0005546565.novalocal sudo[7177]: pam_unix(sudo:session): session closed for user root
Dec 05 08:21:37 np0005546565.novalocal python3[7263]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-f9fe-4324-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:21:46 np0005546565.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 08:22:06 np0005546565.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9436] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 08:22:21 np0005546565.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 08:22:21 np0005546565.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9695] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9697] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9705] device (eth1): Activation: successful, device activated.
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9711] manager: startup complete
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9714] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <warn>  [1764922941.9719] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9725] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 05 08:22:21 np0005546565.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9806] dhcp4 (eth1): canceled DHCP transaction
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9806] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9807] dhcp4 (eth1): state changed no lease
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9821] policy: auto-activating connection 'ci-private-network' (91a49ed8-302a-59cf-a590-ba724ca6d638)
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9825] device (eth1): Activation: starting connection 'ci-private-network' (91a49ed8-302a-59cf-a590-ba724ca6d638)
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9826] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9828] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9834] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9841] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9949] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9951] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:22:21 np0005546565.novalocal NetworkManager[7190]: <info>  [1764922941.9959] device (eth1): Activation: successful, device activated.
Dec 05 08:22:32 np0005546565.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 08:22:37 np0005546565.novalocal sshd-session[6952]: Received disconnect from 38.102.83.114 port 58430:11: disconnected by user
Dec 05 08:22:37 np0005546565.novalocal sshd-session[6952]: Disconnected from user zuul 38.102.83.114 port 58430
Dec 05 08:22:37 np0005546565.novalocal sshd-session[6949]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:22:37 np0005546565.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 05 08:22:37 np0005546565.novalocal systemd[1]: session-3.scope: Consumed 1.555s CPU time.
Dec 05 08:22:37 np0005546565.novalocal systemd-logind[807]: Session 3 logged out. Waiting for processes to exit.
Dec 05 08:22:37 np0005546565.novalocal systemd-logind[807]: Removed session 3.
Dec 05 08:23:05 np0005546565.novalocal sshd-session[7292]: Accepted publickey for zuul from 38.102.83.114 port 46118 ssh2: RSA SHA256:7++2dNw2PJkt7ZbXTGnBEG3OOx6HMlhXSSRVYWpGqmQ
Dec 05 08:23:05 np0005546565.novalocal systemd-logind[807]: New session 4 of user zuul.
Dec 05 08:23:05 np0005546565.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 05 08:23:05 np0005546565.novalocal sshd-session[7292]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:23:05 np0005546565.novalocal sudo[7371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxbxqddvfxaebcdgwvqtqtpilefzuund ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 08:23:05 np0005546565.novalocal sudo[7371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:23:05 np0005546565.novalocal python3[7373]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:23:05 np0005546565.novalocal sudo[7371]: pam_unix(sudo:session): session closed for user root
Dec 05 08:23:06 np0005546565.novalocal sudo[7444]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azvklwcidtudgwfgzansadmhmxlucezc ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 05 08:23:06 np0005546565.novalocal sudo[7444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:23:06 np0005546565.novalocal python3[7446]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764922985.6448107-365-139882683834797/source _original_basename=tmpmo4rhzge follow=False checksum=0e17567cf5e60a1794ca810572c7a4d8296e7102 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:23:06 np0005546565.novalocal sudo[7444]: pam_unix(sudo:session): session closed for user root
Dec 05 08:23:09 np0005546565.novalocal sshd-session[7295]: Connection closed by 38.102.83.114 port 46118
Dec 05 08:23:09 np0005546565.novalocal sshd-session[7292]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:23:09 np0005546565.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 05 08:23:09 np0005546565.novalocal systemd-logind[807]: Session 4 logged out. Waiting for processes to exit.
Dec 05 08:23:09 np0005546565.novalocal systemd-logind[807]: Removed session 4.
Dec 05 08:24:38 np0005546565.novalocal systemd[4300]: Created slice User Background Tasks Slice.
Dec 05 08:24:38 np0005546565.novalocal systemd[4300]: Starting Cleanup of User's Temporary Files and Directories...
Dec 05 08:24:38 np0005546565.novalocal systemd[4300]: Finished Cleanup of User's Temporary Files and Directories.
Dec 05 08:28:08 np0005546565.novalocal sshd-session[7476]: Connection reset by authenticating user root 45.140.17.124 port 20494 [preauth]
Dec 05 08:28:10 np0005546565.novalocal sshd-session[7478]: Connection reset by authenticating user root 45.140.17.124 port 20514 [preauth]
Dec 05 08:28:13 np0005546565.novalocal sshd-session[7480]: Connection reset by authenticating user root 45.140.17.124 port 20542 [preauth]
Dec 05 08:28:14 np0005546565.novalocal sshd-session[7485]: Accepted publickey for zuul from 38.102.83.114 port 57370 ssh2: RSA SHA256:7++2dNw2PJkt7ZbXTGnBEG3OOx6HMlhXSSRVYWpGqmQ
Dec 05 08:28:14 np0005546565.novalocal systemd-logind[807]: New session 5 of user zuul.
Dec 05 08:28:14 np0005546565.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 05 08:28:14 np0005546565.novalocal sshd-session[7485]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:28:14 np0005546565.novalocal sudo[7512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaailwjgclplokypyncjabfbzqcjccxg ; /usr/bin/python3'
Dec 05 08:28:14 np0005546565.novalocal sudo[7512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:15 np0005546565.novalocal python3[7514]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-694a-7516-000000001cd2-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:28:15 np0005546565.novalocal sudo[7512]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:15 np0005546565.novalocal sudo[7540]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kblfnwjsvoyomaebjgrfyrwaaeftwpfz ; /usr/bin/python3'
Dec 05 08:28:15 np0005546565.novalocal sudo[7540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:15 np0005546565.novalocal sshd-session[7482]: Connection reset by authenticating user root 45.140.17.124 port 65424 [preauth]
Dec 05 08:28:15 np0005546565.novalocal python3[7542]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:28:15 np0005546565.novalocal sudo[7540]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:15 np0005546565.novalocal sudo[7568]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snzragntmudcxpotzqjuqctzgdlougeh ; /usr/bin/python3'
Dec 05 08:28:15 np0005546565.novalocal sudo[7568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:15 np0005546565.novalocal python3[7570]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:28:15 np0005546565.novalocal sudo[7568]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:15 np0005546565.novalocal sudo[7595]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwbwccegjgbrhuqddotskavptgwoksqo ; /usr/bin/python3'
Dec 05 08:28:15 np0005546565.novalocal sudo[7595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:16 np0005546565.novalocal python3[7597]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:28:16 np0005546565.novalocal sudo[7595]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:16 np0005546565.novalocal sudo[7621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahutadvfyzxmxywqwzmhzfxneeavrrhw ; /usr/bin/python3'
Dec 05 08:28:16 np0005546565.novalocal sudo[7621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:16 np0005546565.novalocal python3[7623]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:28:16 np0005546565.novalocal sudo[7621]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:16 np0005546565.novalocal sudo[7647]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izewcivgofgeljzijgefezswbabcyixp ; /usr/bin/python3'
Dec 05 08:28:16 np0005546565.novalocal sudo[7647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:16 np0005546565.novalocal python3[7649]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:28:17 np0005546565.novalocal sudo[7647]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:17 np0005546565.novalocal sudo[7725]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cngwqztpaioazjmnyinvqkblcsbxfzfo ; /usr/bin/python3'
Dec 05 08:28:17 np0005546565.novalocal sudo[7725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:17 np0005546565.novalocal python3[7727]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:28:17 np0005546565.novalocal sudo[7725]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:17 np0005546565.novalocal sshd-session[7565]: Connection reset by authenticating user root 45.140.17.124 port 65460 [preauth]
Dec 05 08:28:17 np0005546565.novalocal sudo[7798]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyjuoioixrkrkadclcgxpeuqiggwcaeb ; /usr/bin/python3'
Dec 05 08:28:17 np0005546565.novalocal sudo[7798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:17 np0005546565.novalocal python3[7800]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764923297.2060106-510-81384874551472/source _original_basename=tmpfd6r3aoy follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:28:17 np0005546565.novalocal sudo[7798]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:18 np0005546565.novalocal sudo[7848]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-albmymvtvhwddibezmdbqrnqeeuafzkd ; /usr/bin/python3'
Dec 05 08:28:18 np0005546565.novalocal sudo[7848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:19 np0005546565.novalocal python3[7850]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 08:28:19 np0005546565.novalocal systemd[1]: Reloading.
Dec 05 08:28:19 np0005546565.novalocal systemd-rc-local-generator[7870]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:28:19 np0005546565.novalocal sudo[7848]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:20 np0005546565.novalocal sudo[7903]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpcinljxbdlfqxdgjaeyxfsiaoutvxzm ; /usr/bin/python3'
Dec 05 08:28:20 np0005546565.novalocal sudo[7903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:20 np0005546565.novalocal python3[7905]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 05 08:28:20 np0005546565.novalocal sudo[7903]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:21 np0005546565.novalocal sudo[7929]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sykcqeiwuyejeouybhwbefqslnzsisjr ; /usr/bin/python3'
Dec 05 08:28:21 np0005546565.novalocal sudo[7929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:21 np0005546565.novalocal python3[7931]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:28:21 np0005546565.novalocal sudo[7929]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:21 np0005546565.novalocal sudo[7957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqromgwdcaglthvnqnsfzsikiulorcdq ; /usr/bin/python3'
Dec 05 08:28:21 np0005546565.novalocal sudo[7957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:21 np0005546565.novalocal python3[7959]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:28:21 np0005546565.novalocal sudo[7957]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:21 np0005546565.novalocal sudo[7985]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tboooehijifqvrtxgrthhvkzpojyuudb ; /usr/bin/python3'
Dec 05 08:28:21 np0005546565.novalocal sudo[7985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:21 np0005546565.novalocal python3[7987]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:28:21 np0005546565.novalocal sudo[7985]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:21 np0005546565.novalocal sudo[8013]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flikzzcjavklhljakhpxsglasdszcdvn ; /usr/bin/python3'
Dec 05 08:28:21 np0005546565.novalocal sudo[8013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:22 np0005546565.novalocal python3[8015]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:28:22 np0005546565.novalocal sudo[8013]: pam_unix(sudo:session): session closed for user root
Dec 05 08:28:23 np0005546565.novalocal python3[8042]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-694a-7516-000000001cd9-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:28:23 np0005546565.novalocal python3[8072]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 05 08:28:26 np0005546565.novalocal sshd-session[7488]: Connection closed by 38.102.83.114 port 57370
Dec 05 08:28:26 np0005546565.novalocal sshd-session[7485]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:28:26 np0005546565.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 05 08:28:26 np0005546565.novalocal systemd[1]: session-5.scope: Consumed 4.246s CPU time.
Dec 05 08:28:26 np0005546565.novalocal systemd-logind[807]: Session 5 logged out. Waiting for processes to exit.
Dec 05 08:28:26 np0005546565.novalocal systemd-logind[807]: Removed session 5.
Dec 05 08:28:28 np0005546565.novalocal sshd-session[8076]: Accepted publickey for zuul from 38.102.83.114 port 53514 ssh2: RSA SHA256:7++2dNw2PJkt7ZbXTGnBEG3OOx6HMlhXSSRVYWpGqmQ
Dec 05 08:28:28 np0005546565.novalocal systemd-logind[807]: New session 6 of user zuul.
Dec 05 08:28:28 np0005546565.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 05 08:28:28 np0005546565.novalocal sshd-session[8076]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:28:28 np0005546565.novalocal sudo[8103]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgaephibgclzwkyqegsmkyjfltcgraki ; /usr/bin/python3'
Dec 05 08:28:28 np0005546565.novalocal sudo[8103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:28:28 np0005546565.novalocal python3[8105]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 05 08:28:42 np0005546565.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 05 08:28:42 np0005546565.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 08:28:42 np0005546565.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 05 08:28:42 np0005546565.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 08:28:42 np0005546565.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 05 08:28:42 np0005546565.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 08:28:42 np0005546565.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 08:28:42 np0005546565.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 08:28:51 np0005546565.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 05 08:28:51 np0005546565.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 08:28:51 np0005546565.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 05 08:28:51 np0005546565.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 08:28:51 np0005546565.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 05 08:28:51 np0005546565.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 08:28:51 np0005546565.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 08:28:51 np0005546565.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 08:29:00 np0005546565.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 05 08:29:00 np0005546565.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 08:29:00 np0005546565.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 05 08:29:00 np0005546565.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 08:29:00 np0005546565.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 05 08:29:00 np0005546565.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 08:29:00 np0005546565.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 08:29:00 np0005546565.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 08:29:01 np0005546565.novalocal setsebool[8173]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 05 08:29:01 np0005546565.novalocal setsebool[8173]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 05 08:29:12 np0005546565.novalocal kernel: SELinux:  Converting 388 SID table entries...
Dec 05 08:29:12 np0005546565.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 08:29:12 np0005546565.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 05 08:29:12 np0005546565.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 08:29:12 np0005546565.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 05 08:29:12 np0005546565.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 08:29:12 np0005546565.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 08:29:12 np0005546565.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 08:29:30 np0005546565.novalocal dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 05 08:29:30 np0005546565.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 08:29:30 np0005546565.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 05 08:29:30 np0005546565.novalocal systemd[1]: Reloading.
Dec 05 08:29:30 np0005546565.novalocal systemd-rc-local-generator[8928]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:29:30 np0005546565.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 08:29:31 np0005546565.novalocal sudo[8103]: pam_unix(sudo:session): session closed for user root
Dec 05 08:29:36 np0005546565.novalocal python3[13940]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-b195-f7de-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:29:37 np0005546565.novalocal kernel: evm: overlay not supported
Dec 05 08:29:37 np0005546565.novalocal systemd[4300]: Starting D-Bus User Message Bus...
Dec 05 08:29:37 np0005546565.novalocal dbus-broker-launch[14325]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 05 08:29:37 np0005546565.novalocal dbus-broker-launch[14325]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 05 08:29:37 np0005546565.novalocal systemd[4300]: Started D-Bus User Message Bus.
Dec 05 08:29:37 np0005546565.novalocal dbus-broker-lau[14325]: Ready
Dec 05 08:29:37 np0005546565.novalocal systemd[4300]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 05 08:29:37 np0005546565.novalocal systemd[4300]: Created slice Slice /user.
Dec 05 08:29:37 np0005546565.novalocal systemd[4300]: podman-14253.scope: unit configures an IP firewall, but not running as root.
Dec 05 08:29:37 np0005546565.novalocal systemd[4300]: (This warning is only shown for the first unit using IP firewalling.)
Dec 05 08:29:37 np0005546565.novalocal systemd[4300]: Started podman-14253.scope.
Dec 05 08:29:37 np0005546565.novalocal systemd[4300]: Started podman-pause-d027e637.scope.
Dec 05 08:29:38 np0005546565.novalocal sudo[14819]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcuxeebwsqomjphqymqxoxzxtprkqffw ; /usr/bin/python3'
Dec 05 08:29:38 np0005546565.novalocal sudo[14819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:29:38 np0005546565.novalocal python3[14830]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.175:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.175:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:29:38 np0005546565.novalocal python3[14830]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 05 08:29:38 np0005546565.novalocal sudo[14819]: pam_unix(sudo:session): session closed for user root
Dec 05 08:29:39 np0005546565.novalocal sshd-session[8079]: Connection closed by 38.102.83.114 port 53514
Dec 05 08:29:39 np0005546565.novalocal sshd-session[8076]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:29:39 np0005546565.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Dec 05 08:29:39 np0005546565.novalocal systemd[1]: session-6.scope: Consumed 1min 183ms CPU time.
Dec 05 08:29:39 np0005546565.novalocal systemd-logind[807]: Session 6 logged out. Waiting for processes to exit.
Dec 05 08:29:39 np0005546565.novalocal systemd-logind[807]: Removed session 6.
Dec 05 08:29:57 np0005546565.novalocal sshd-session[24038]: Connection closed by 38.102.83.238 port 54976 [preauth]
Dec 05 08:29:57 np0005546565.novalocal sshd-session[24036]: Connection closed by 38.102.83.238 port 54970 [preauth]
Dec 05 08:29:57 np0005546565.novalocal sshd-session[24041]: Unable to negotiate with 38.102.83.238 port 54984: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 05 08:29:57 np0005546565.novalocal sshd-session[24042]: Unable to negotiate with 38.102.83.238 port 54988: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 05 08:29:57 np0005546565.novalocal sshd-session[24044]: Unable to negotiate with 38.102.83.238 port 54994: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 05 08:30:02 np0005546565.novalocal sshd-session[26383]: Accepted publickey for zuul from 38.102.83.114 port 56388 ssh2: RSA SHA256:7++2dNw2PJkt7ZbXTGnBEG3OOx6HMlhXSSRVYWpGqmQ
Dec 05 08:30:02 np0005546565.novalocal systemd-logind[807]: New session 7 of user zuul.
Dec 05 08:30:02 np0005546565.novalocal systemd[1]: Started Session 7 of User zuul.
Dec 05 08:30:02 np0005546565.novalocal sshd-session[26383]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:30:03 np0005546565.novalocal python3[26506]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLQsfyw1BtfRs15kjKPo1toVJI5qvd/79HW6DeDcnao3HOHOC0wrwQht/v4tVmqv8sOzR40EJgsmxA21LHypqRU= zuul@np0005546563.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:30:03 np0005546565.novalocal sudo[26720]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmwphoypbvdnclyljzgwovdoqguxdxdm ; /usr/bin/python3'
Dec 05 08:30:03 np0005546565.novalocal sudo[26720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:30:03 np0005546565.novalocal python3[26730]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLQsfyw1BtfRs15kjKPo1toVJI5qvd/79HW6DeDcnao3HOHOC0wrwQht/v4tVmqv8sOzR40EJgsmxA21LHypqRU= zuul@np0005546563.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:30:03 np0005546565.novalocal sudo[26720]: pam_unix(sudo:session): session closed for user root
Dec 05 08:30:03 np0005546565.novalocal irqbalance[793]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 05 08:30:03 np0005546565.novalocal irqbalance[793]: IRQ 27 affinity is now unmanaged
Dec 05 08:30:04 np0005546565.novalocal sudo[27135]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjmfhytahafyucubiwflygjwirjyvzyn ; /usr/bin/python3'
Dec 05 08:30:04 np0005546565.novalocal sudo[27135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:30:04 np0005546565.novalocal python3[27142]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005546565.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 05 08:30:04 np0005546565.novalocal useradd[27216]: new group: name=cloud-admin, GID=1002
Dec 05 08:30:04 np0005546565.novalocal useradd[27216]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 05 08:30:04 np0005546565.novalocal sudo[27135]: pam_unix(sudo:session): session closed for user root
Dec 05 08:30:05 np0005546565.novalocal sudo[27351]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjfyxyxxeonojhjckkpayxtsgpmvzhxh ; /usr/bin/python3'
Dec 05 08:30:05 np0005546565.novalocal sudo[27351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:30:05 np0005546565.novalocal python3[27358]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLQsfyw1BtfRs15kjKPo1toVJI5qvd/79HW6DeDcnao3HOHOC0wrwQht/v4tVmqv8sOzR40EJgsmxA21LHypqRU= zuul@np0005546563.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 08:30:05 np0005546565.novalocal sudo[27351]: pam_unix(sudo:session): session closed for user root
Dec 05 08:30:05 np0005546565.novalocal sudo[27632]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrlxhdxvfbcajlehlppyxetubxbadwcr ; /usr/bin/python3'
Dec 05 08:30:05 np0005546565.novalocal sudo[27632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:30:05 np0005546565.novalocal python3[27638]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:30:05 np0005546565.novalocal sudo[27632]: pam_unix(sudo:session): session closed for user root
Dec 05 08:30:06 np0005546565.novalocal sudo[27898]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqbkmasqtjbkhazrspoiohqafhrjngti ; /usr/bin/python3'
Dec 05 08:30:06 np0005546565.novalocal sudo[27898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:30:06 np0005546565.novalocal python3[27906]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764923405.4246168-168-257636466765434/source _original_basename=tmpgu_3kx37 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:30:06 np0005546565.novalocal sudo[27898]: pam_unix(sudo:session): session closed for user root
Dec 05 08:30:06 np0005546565.novalocal sudo[28243]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mewhmqsxopdomctgvtiknwlniblidjql ; /usr/bin/python3'
Dec 05 08:30:06 np0005546565.novalocal sudo[28243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:30:06 np0005546565.novalocal python3[28249]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Dec 05 08:30:06 np0005546565.novalocal systemd[1]: Starting Hostname Service...
Dec 05 08:30:07 np0005546565.novalocal systemd[1]: Started Hostname Service.
Dec 05 08:30:07 np0005546565.novalocal systemd-hostnamed[28336]: Changed pretty hostname to 'compute-1'
Dec 05 08:30:07 compute-1 systemd-hostnamed[28336]: Hostname set to <compute-1> (static)
Dec 05 08:30:07 compute-1 NetworkManager[7190]: <info>  [1764923407.1007] hostname: static hostname changed from "np0005546565.novalocal" to "compute-1"
Dec 05 08:30:07 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 08:30:07 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 08:30:07 compute-1 sudo[28243]: pam_unix(sudo:session): session closed for user root
Dec 05 08:30:07 compute-1 sshd-session[26443]: Connection closed by 38.102.83.114 port 56388
Dec 05 08:30:07 compute-1 sshd-session[26383]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:30:07 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Dec 05 08:30:07 compute-1 systemd[1]: session-7.scope: Consumed 2.237s CPU time.
Dec 05 08:30:07 compute-1 systemd-logind[807]: Session 7 logged out. Waiting for processes to exit.
Dec 05 08:30:07 compute-1 systemd-logind[807]: Removed session 7.
Dec 05 08:30:10 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 08:30:10 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 05 08:30:10 compute-1 systemd[1]: man-db-cache-update.service: Consumed 48.702s CPU time.
Dec 05 08:30:10 compute-1 systemd[1]: run-rcf6db9d5d63b4637b26650ed4a63bcec.service: Deactivated successfully.
Dec 05 08:30:17 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 08:30:37 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 08:31:28 compute-1 systemd[1]: Starting dnf makecache...
Dec 05 08:31:28 compute-1 dnf[29988]: Failed determining last makecache time.
Dec 05 08:31:29 compute-1 dnf[29988]: CentOS Stream 9 - BaseOS                         30 kB/s | 7.3 kB     00:00
Dec 05 08:31:29 compute-1 dnf[29988]: CentOS Stream 9 - AppStream                      84 kB/s | 7.4 kB     00:00
Dec 05 08:31:29 compute-1 dnf[29988]: CentOS Stream 9 - CRB                            31 kB/s | 7.2 kB     00:00
Dec 05 08:31:29 compute-1 dnf[29988]: CentOS Stream 9 - Extras packages                86 kB/s | 8.3 kB     00:00
Dec 05 08:31:29 compute-1 dnf[29988]: Metadata cache created.
Dec 05 08:31:29 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 05 08:31:29 compute-1 systemd[1]: Finished dnf makecache.
Dec 05 08:34:28 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 05 08:34:28 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 05 08:34:28 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 05 08:34:28 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 05 08:35:47 compute-1 sshd-session[30000]: Accepted publickey for zuul from 38.102.83.238 port 36882 ssh2: RSA SHA256:7++2dNw2PJkt7ZbXTGnBEG3OOx6HMlhXSSRVYWpGqmQ
Dec 05 08:35:47 compute-1 systemd-logind[807]: New session 8 of user zuul.
Dec 05 08:35:47 compute-1 systemd[1]: Started Session 8 of User zuul.
Dec 05 08:35:47 compute-1 sshd-session[30000]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:35:47 compute-1 python3[30076]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:35:50 compute-1 sudo[30190]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hduddifgfoiqindtomxdatlkdibdgynz ; /usr/bin/python3'
Dec 05 08:35:50 compute-1 sudo[30190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:50 compute-1 python3[30192]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:35:50 compute-1 sudo[30190]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:51 compute-1 sudo[30263]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwpjhrlzglvsuhukadqwhuqunqoheqdv ; /usr/bin/python3'
Dec 05 08:35:51 compute-1 sudo[30263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:51 compute-1 python3[30265]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764923750.5696096-36304-17841496049482/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:35:51 compute-1 sudo[30263]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:51 compute-1 sudo[30289]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtvmdonsetwlyfiqgmzxgszgqqzledgw ; /usr/bin/python3'
Dec 05 08:35:51 compute-1 sudo[30289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:51 compute-1 python3[30291]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:35:51 compute-1 sudo[30289]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:51 compute-1 sudo[30362]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuslewuaginaxhfqvzrczaobjurrmycv ; /usr/bin/python3'
Dec 05 08:35:51 compute-1 sudo[30362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:51 compute-1 python3[30364]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764923750.5696096-36304-17841496049482/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:35:52 compute-1 sudo[30362]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:52 compute-1 sudo[30388]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggecwvszfbxokvdjczrisdsvimtlmuvw ; /usr/bin/python3'
Dec 05 08:35:52 compute-1 sudo[30388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:52 compute-1 python3[30390]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:35:52 compute-1 sudo[30388]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:52 compute-1 sudo[30461]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqfzksirwmulhhlhucjyutqotujayzhn ; /usr/bin/python3'
Dec 05 08:35:52 compute-1 sudo[30461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:52 compute-1 python3[30463]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764923750.5696096-36304-17841496049482/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:35:52 compute-1 sudo[30461]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:52 compute-1 sudo[30487]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crsqijgkvoaiothagunmxgywzwjqjqku ; /usr/bin/python3'
Dec 05 08:35:52 compute-1 sudo[30487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:52 compute-1 python3[30489]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:35:52 compute-1 sudo[30487]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:53 compute-1 sudo[30560]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kelpaiqeclheewzrkqehwlwueshkvwhf ; /usr/bin/python3'
Dec 05 08:35:53 compute-1 sudo[30560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:53 compute-1 python3[30562]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764923750.5696096-36304-17841496049482/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:35:53 compute-1 sudo[30560]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:53 compute-1 sudo[30586]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veuzhhlourundslzvocefhozkiywfehc ; /usr/bin/python3'
Dec 05 08:35:53 compute-1 sudo[30586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:53 compute-1 python3[30588]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:35:53 compute-1 sudo[30586]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:53 compute-1 sudo[30659]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhixtjexujxjcstxqhfakqeknzqrhqru ; /usr/bin/python3'
Dec 05 08:35:53 compute-1 sudo[30659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:53 compute-1 python3[30661]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764923750.5696096-36304-17841496049482/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:35:53 compute-1 sudo[30659]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:53 compute-1 sudo[30685]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvcqzitffvlkgiszfszgwgnyfsixdbko ; /usr/bin/python3'
Dec 05 08:35:53 compute-1 sudo[30685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:54 compute-1 python3[30687]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:35:54 compute-1 sudo[30685]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:54 compute-1 sudo[30758]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znypzfozcgoumnofhpbaztydlvtpofeu ; /usr/bin/python3'
Dec 05 08:35:54 compute-1 sudo[30758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:54 compute-1 python3[30760]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764923750.5696096-36304-17841496049482/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:35:54 compute-1 sudo[30758]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:54 compute-1 sudo[30784]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljnxcloppoycqtfcvrwybbwqxmdqlvqz ; /usr/bin/python3'
Dec 05 08:35:54 compute-1 sudo[30784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:54 compute-1 python3[30786]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 08:35:54 compute-1 sudo[30784]: pam_unix(sudo:session): session closed for user root
Dec 05 08:35:54 compute-1 sudo[30857]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rymdquhuopnceiazsznapupopwqklskk ; /usr/bin/python3'
Dec 05 08:35:54 compute-1 sudo[30857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:35:54 compute-1 python3[30859]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764923750.5696096-36304-17841496049482/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:35:54 compute-1 sudo[30857]: pam_unix(sudo:session): session closed for user root
Dec 05 08:36:06 compute-1 python3[30907]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:39:26 compute-1 sshd-session[30910]: Connection reset by authenticating user root 91.202.233.33 port 23246 [preauth]
Dec 05 08:39:29 compute-1 sshd-session[30912]: Connection reset by authenticating user root 91.202.233.33 port 23272 [preauth]
Dec 05 08:39:31 compute-1 sshd-session[30914]: Connection reset by authenticating user root 91.202.233.33 port 23278 [preauth]
Dec 05 08:39:33 compute-1 sshd-session[30916]: Invalid user vagrant from 91.202.233.33 port 45170
Dec 05 08:39:34 compute-1 sshd-session[30916]: Connection reset by invalid user vagrant 91.202.233.33 port 45170 [preauth]
Dec 05 08:39:37 compute-1 sshd-session[30919]: Connection reset by authenticating user root 91.202.233.33 port 45188 [preauth]
Dec 05 08:41:06 compute-1 sshd-session[30003]: Received disconnect from 38.102.83.238 port 36882:11: disconnected by user
Dec 05 08:41:06 compute-1 sshd-session[30003]: Disconnected from user zuul 38.102.83.238 port 36882
Dec 05 08:41:06 compute-1 sshd-session[30000]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:41:06 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Dec 05 08:41:06 compute-1 systemd[1]: session-8.scope: Consumed 4.926s CPU time.
Dec 05 08:41:06 compute-1 systemd-logind[807]: Session 8 logged out. Waiting for processes to exit.
Dec 05 08:41:06 compute-1 systemd-logind[807]: Removed session 8.
Dec 05 08:45:24 compute-1 sshd-session[30924]: Received disconnect from 101.47.162.91 port 39994:11: Bye Bye [preauth]
Dec 05 08:45:24 compute-1 sshd-session[30924]: Disconnected from authenticating user root 101.47.162.91 port 39994 [preauth]
Dec 05 08:47:03 compute-1 sshd[1008]: Timeout before authentication for connection from 115.190.64.245 to 38.102.83.154, pid = 30923
Dec 05 08:47:25 compute-1 sshd-session[30929]: Received disconnect from 43.225.158.169 port 53803:11: Bye Bye [preauth]
Dec 05 08:47:25 compute-1 sshd-session[30929]: Disconnected from authenticating user root 43.225.158.169 port 53803 [preauth]
Dec 05 08:47:30 compute-1 sshd-session[30931]: Received disconnect from 122.114.113.177 port 57462:11: Bye Bye [preauth]
Dec 05 08:47:30 compute-1 sshd-session[30931]: Disconnected from authenticating user root 122.114.113.177 port 57462 [preauth]
Dec 05 08:47:46 compute-1 sshd-session[30933]: Received disconnect from 122.168.194.41 port 40192:11: Bye Bye [preauth]
Dec 05 08:47:46 compute-1 sshd-session[30933]: Disconnected from authenticating user root 122.168.194.41 port 40192 [preauth]
Dec 05 08:48:03 compute-1 sshd-session[30935]: Received disconnect from 185.118.15.236 port 60996:11: Bye Bye [preauth]
Dec 05 08:48:03 compute-1 sshd-session[30935]: Disconnected from authenticating user root 185.118.15.236 port 60996 [preauth]
Dec 05 08:48:54 compute-1 sshd[1008]: Timeout before authentication for connection from 112.20.185.158 to 38.102.83.154, pid = 30928
Dec 05 08:49:56 compute-1 sshd-session[30940]: Received disconnect from 122.168.194.41 port 35540:11: Bye Bye [preauth]
Dec 05 08:49:56 compute-1 sshd-session[30940]: Disconnected from authenticating user root 122.168.194.41 port 35540 [preauth]
Dec 05 08:49:59 compute-1 sshd-session[30942]: Received disconnect from 43.225.158.169 port 45994:11: Bye Bye [preauth]
Dec 05 08:49:59 compute-1 sshd-session[30942]: Disconnected from authenticating user root 43.225.158.169 port 45994 [preauth]
Dec 05 08:50:28 compute-1 sshd[1008]: Timeout before authentication for connection from 101.126.71.100 to 38.102.83.154, pid = 30937
Dec 05 08:50:40 compute-1 sshd-session[30946]: Accepted publickey for zuul from 192.168.122.30 port 35304 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 08:50:40 compute-1 systemd-logind[807]: New session 9 of user zuul.
Dec 05 08:50:40 compute-1 systemd[1]: Started Session 9 of User zuul.
Dec 05 08:50:40 compute-1 sshd-session[30946]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:50:41 compute-1 python3.9[31099]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:50:42 compute-1 sudo[31278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lombqrnfchkotzssuyjrwtcdzvxuqeif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924642.3186066-57-61453909742596/AnsiballZ_command.py'
Dec 05 08:50:42 compute-1 sudo[31278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:50:43 compute-1 python3.9[31280]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:50:50 compute-1 sudo[31278]: pam_unix(sudo:session): session closed for user root
Dec 05 08:50:50 compute-1 sshd-session[30949]: Connection closed by 192.168.122.30 port 35304
Dec 05 08:50:50 compute-1 sshd-session[30946]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:50:50 compute-1 systemd-logind[807]: Session 9 logged out. Waiting for processes to exit.
Dec 05 08:50:50 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Dec 05 08:50:50 compute-1 systemd[1]: session-9.scope: Consumed 7.854s CPU time.
Dec 05 08:50:50 compute-1 systemd-logind[807]: Removed session 9.
Dec 05 08:50:55 compute-1 sshd-session[31339]: Accepted publickey for zuul from 192.168.122.30 port 37274 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 08:50:55 compute-1 systemd-logind[807]: New session 10 of user zuul.
Dec 05 08:50:55 compute-1 systemd[1]: Started Session 10 of User zuul.
Dec 05 08:50:55 compute-1 sshd-session[31339]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:50:56 compute-1 python3.9[31492]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:50:57 compute-1 sshd-session[31342]: Connection closed by 192.168.122.30 port 37274
Dec 05 08:50:57 compute-1 sshd-session[31339]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:50:57 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Dec 05 08:50:57 compute-1 systemd-logind[807]: Session 10 logged out. Waiting for processes to exit.
Dec 05 08:50:57 compute-1 systemd-logind[807]: Removed session 10.
Dec 05 08:50:59 compute-1 sshd-session[31519]: Received disconnect from 185.118.15.236 port 32956:11: Bye Bye [preauth]
Dec 05 08:50:59 compute-1 sshd-session[31519]: Disconnected from authenticating user root 185.118.15.236 port 32956 [preauth]
Dec 05 08:51:12 compute-1 sshd-session[31521]: Accepted publickey for zuul from 192.168.122.30 port 56482 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 08:51:12 compute-1 systemd-logind[807]: New session 11 of user zuul.
Dec 05 08:51:12 compute-1 systemd[1]: Started Session 11 of User zuul.
Dec 05 08:51:12 compute-1 sshd-session[31521]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:51:13 compute-1 python3.9[31674]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 05 08:51:14 compute-1 python3.9[31848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:51:15 compute-1 sudo[31998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsnunnzqaxwpzypvuqhxfjnvxutafxdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924675.4063196-94-166025376179555/AnsiballZ_command.py'
Dec 05 08:51:15 compute-1 sudo[31998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:16 compute-1 python3.9[32000]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:51:16 compute-1 sudo[31998]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:17 compute-1 sudo[32151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clyflwrtxmugsykuxvwuncyazwajvywz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924676.6742146-130-101527147910338/AnsiballZ_stat.py'
Dec 05 08:51:17 compute-1 sudo[32151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:17 compute-1 python3.9[32155]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 08:51:17 compute-1 sudo[32151]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:18 compute-1 sudo[32305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyftluhyyeylsmgqdvpyerekwncowxgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924677.5215485-154-28941679808915/AnsiballZ_file.py'
Dec 05 08:51:18 compute-1 sudo[32305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:18 compute-1 python3.9[32307]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:51:18 compute-1 sudo[32305]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:18 compute-1 sshd-session[32152]: Received disconnect from 43.225.158.169 port 59136:11: Bye Bye [preauth]
Dec 05 08:51:18 compute-1 sshd-session[32152]: Disconnected from authenticating user root 43.225.158.169 port 59136 [preauth]
Dec 05 08:51:18 compute-1 sudo[32457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyrwoeoryalcwgyqpeavtjvjgppfzewl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924678.547561-178-271779938613639/AnsiballZ_stat.py'
Dec 05 08:51:18 compute-1 sudo[32457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:19 compute-1 python3.9[32459]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:51:19 compute-1 sudo[32457]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:19 compute-1 sudo[32580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmsrhyorvffakxzgotrqvqvqbvezshfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924678.547561-178-271779938613639/AnsiballZ_copy.py'
Dec 05 08:51:19 compute-1 sudo[32580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:19 compute-1 python3.9[32582]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764924678.547561-178-271779938613639/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:51:19 compute-1 sudo[32580]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:20 compute-1 sudo[32734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzconyioponqivkamohpzvkrvbmfnkgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924680.1555681-223-30584419842729/AnsiballZ_setup.py'
Dec 05 08:51:20 compute-1 sudo[32734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:21 compute-1 python3.9[32736]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:51:21 compute-1 sudo[32734]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:21 compute-1 sshd-session[32607]: Received disconnect from 122.168.194.41 port 42142:11: Bye Bye [preauth]
Dec 05 08:51:21 compute-1 sshd-session[32607]: Disconnected from authenticating user root 122.168.194.41 port 42142 [preauth]
Dec 05 08:51:21 compute-1 sudo[32890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-minfsdfhfmcuppiczdidktldhclvzadm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924681.5185993-247-1196005553588/AnsiballZ_file.py'
Dec 05 08:51:21 compute-1 sudo[32890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:22 compute-1 python3.9[32892]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:51:22 compute-1 sudo[32890]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:22 compute-1 sudo[33042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnedockhxcywgmmimrcwqggneekxjxjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924682.3692129-274-254865611345655/AnsiballZ_file.py'
Dec 05 08:51:22 compute-1 sudo[33042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:22 compute-1 python3.9[33044]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:51:22 compute-1 sudo[33042]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:23 compute-1 python3.9[33194]: ansible-ansible.builtin.service_facts Invoked
Dec 05 08:51:29 compute-1 sshd-session[33322]: Connection reset by authenticating user root 45.140.17.124 port 58296 [preauth]
Dec 05 08:51:30 compute-1 python3.9[33449]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:51:30 compute-1 sshd[1008]: Timeout before authentication for connection from 101.47.162.91 to 38.102.83.154, pid = 30938
Dec 05 08:51:31 compute-1 python3.9[33601]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:51:32 compute-1 python3.9[33755]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:51:32 compute-1 sshd-session[33450]: Connection reset by authenticating user root 45.140.17.124 port 58312 [preauth]
Dec 05 08:51:33 compute-1 sudo[33914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uudhjzfdvsmcptyiydrmffqjjbpoadxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924693.2246532-418-98388550097006/AnsiballZ_setup.py'
Dec 05 08:51:33 compute-1 sudo[33914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:33 compute-1 python3.9[33916]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:51:34 compute-1 sudo[33914]: pam_unix(sudo:session): session closed for user root
Dec 05 08:51:34 compute-1 sudo[33998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcnlnntvkrtbyrgfkjsgxindkqmiwzrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924693.2246532-418-98388550097006/AnsiballZ_dnf.py'
Dec 05 08:51:34 compute-1 sudo[33998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:51:34 compute-1 sshd-session[33787]: Invalid user user from 45.140.17.124 port 37476
Dec 05 08:51:34 compute-1 python3.9[34000]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:51:35 compute-1 sshd-session[33787]: Connection reset by invalid user user 45.140.17.124 port 37476 [preauth]
Dec 05 08:51:37 compute-1 sshd-session[34011]: Connection reset by authenticating user root 45.140.17.124 port 37492 [preauth]
Dec 05 08:51:48 compute-1 sshd-session[34066]: Connection reset by 45.140.17.124 port 37510 [preauth]
Dec 05 08:52:03 compute-1 sshd-session[34145]: error: maximum authentication attempts exceeded for root from 176.235.182.73 port 41762 ssh2 [preauth]
Dec 05 08:52:03 compute-1 sshd-session[34145]: Disconnecting authenticating user root 176.235.182.73 port 41762: Too many authentication failures [preauth]
Dec 05 08:52:04 compute-1 sshd[1008]: Timeout before authentication for connection from 14.103.118.153 to 38.102.83.154, pid = 30944
Dec 05 08:52:05 compute-1 sshd-session[34153]: error: maximum authentication attempts exceeded for root from 176.235.182.73 port 42240 ssh2 [preauth]
Dec 05 08:52:05 compute-1 sshd-session[34153]: Disconnecting authenticating user root 176.235.182.73 port 42240: Too many authentication failures [preauth]
Dec 05 08:52:07 compute-1 sshd-session[34156]: error: maximum authentication attempts exceeded for root from 176.235.182.73 port 42687 ssh2 [preauth]
Dec 05 08:52:07 compute-1 sshd-session[34156]: Disconnecting authenticating user root 176.235.182.73 port 42687: Too many authentication failures [preauth]
Dec 05 08:52:08 compute-1 sshd-session[34158]: Received disconnect from 176.235.182.73 port 43192:11: disconnected by user [preauth]
Dec 05 08:52:08 compute-1 sshd-session[34158]: Disconnected from authenticating user root 176.235.182.73 port 43192 [preauth]
Dec 05 08:52:09 compute-1 sshd-session[34160]: Invalid user admin from 176.235.182.73 port 43471
Dec 05 08:52:10 compute-1 sshd-session[34160]: error: maximum authentication attempts exceeded for invalid user admin from 176.235.182.73 port 43471 ssh2 [preauth]
Dec 05 08:52:10 compute-1 sshd-session[34160]: Disconnecting invalid user admin 176.235.182.73 port 43471: Too many authentication failures [preauth]
Dec 05 08:52:11 compute-1 sshd-session[34162]: Invalid user admin from 176.235.182.73 port 44014
Dec 05 08:52:12 compute-1 sshd-session[34162]: error: maximum authentication attempts exceeded for invalid user admin from 176.235.182.73 port 44014 ssh2 [preauth]
Dec 05 08:52:12 compute-1 sshd-session[34162]: Disconnecting invalid user admin 176.235.182.73 port 44014: Too many authentication failures [preauth]
Dec 05 08:52:13 compute-1 sshd-session[34164]: Invalid user admin from 176.235.182.73 port 44553
Dec 05 08:52:14 compute-1 sshd-session[34164]: Received disconnect from 176.235.182.73 port 44553:11: disconnected by user [preauth]
Dec 05 08:52:14 compute-1 sshd-session[34164]: Disconnected from invalid user admin 176.235.182.73 port 44553 [preauth]
Dec 05 08:52:15 compute-1 sshd-session[34166]: Invalid user oracle from 176.235.182.73 port 44930
Dec 05 08:52:16 compute-1 sshd-session[34166]: error: maximum authentication attempts exceeded for invalid user oracle from 176.235.182.73 port 44930 ssh2 [preauth]
Dec 05 08:52:16 compute-1 sshd-session[34166]: Disconnecting invalid user oracle 176.235.182.73 port 44930: Too many authentication failures [preauth]
Dec 05 08:52:17 compute-1 sshd-session[34168]: Invalid user oracle from 176.235.182.73 port 45459
Dec 05 08:52:18 compute-1 sshd-session[34168]: error: maximum authentication attempts exceeded for invalid user oracle from 176.235.182.73 port 45459 ssh2 [preauth]
Dec 05 08:52:18 compute-1 sshd-session[34168]: Disconnecting invalid user oracle 176.235.182.73 port 45459: Too many authentication failures [preauth]
Dec 05 08:52:19 compute-1 sshd-session[34170]: Invalid user oracle from 176.235.182.73 port 45989
Dec 05 08:52:19 compute-1 sshd-session[34172]: Received disconnect from 185.118.15.236 port 33074:11: Bye Bye [preauth]
Dec 05 08:52:19 compute-1 sshd-session[34172]: Disconnected from authenticating user root 185.118.15.236 port 33074 [preauth]
Dec 05 08:52:20 compute-1 sshd-session[34170]: Received disconnect from 176.235.182.73 port 45989:11: disconnected by user [preauth]
Dec 05 08:52:20 compute-1 sshd-session[34170]: Disconnected from invalid user oracle 176.235.182.73 port 45989 [preauth]
Dec 05 08:52:21 compute-1 sshd-session[34179]: Invalid user usuario from 176.235.182.73 port 46459
Dec 05 08:52:21 compute-1 systemd[1]: Reloading.
Dec 05 08:52:21 compute-1 systemd-rc-local-generator[34221]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:52:21 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 05 08:52:21 compute-1 sshd-session[34179]: error: maximum authentication attempts exceeded for invalid user usuario from 176.235.182.73 port 46459 ssh2 [preauth]
Dec 05 08:52:21 compute-1 sshd-session[34179]: Disconnecting invalid user usuario 176.235.182.73 port 46459: Too many authentication failures [preauth]
Dec 05 08:52:22 compute-1 systemd[1]: Reloading.
Dec 05 08:52:22 compute-1 systemd-rc-local-generator[34267]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:52:22 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 05 08:52:22 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 05 08:52:22 compute-1 systemd[1]: Reloading.
Dec 05 08:52:22 compute-1 systemd-rc-local-generator[34310]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:52:22 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 05 08:52:22 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 05 08:52:22 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 05 08:52:22 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 05 08:52:23 compute-1 sshd-session[34276]: Invalid user usuario from 176.235.182.73 port 46957
Dec 05 08:52:23 compute-1 sshd-session[34276]: error: maximum authentication attempts exceeded for invalid user usuario from 176.235.182.73 port 46957 ssh2 [preauth]
Dec 05 08:52:23 compute-1 sshd-session[34276]: Disconnecting invalid user usuario 176.235.182.73 port 46957: Too many authentication failures [preauth]
Dec 05 08:52:24 compute-1 sshd-session[34333]: Invalid user usuario from 176.235.182.73 port 47489
Dec 05 08:52:25 compute-1 sshd-session[34333]: Received disconnect from 176.235.182.73 port 47489:11: disconnected by user [preauth]
Dec 05 08:52:25 compute-1 sshd-session[34333]: Disconnected from invalid user usuario 176.235.182.73 port 47489 [preauth]
Dec 05 08:52:26 compute-1 sshd-session[34340]: Invalid user test from 176.235.182.73 port 47736
Dec 05 08:52:27 compute-1 sshd-session[34340]: error: maximum authentication attempts exceeded for invalid user test from 176.235.182.73 port 47736 ssh2 [preauth]
Dec 05 08:52:27 compute-1 sshd-session[34340]: Disconnecting invalid user test 176.235.182.73 port 47736: Too many authentication failures [preauth]
Dec 05 08:52:28 compute-1 sshd-session[34350]: Invalid user test from 176.235.182.73 port 48224
Dec 05 08:52:28 compute-1 sshd-session[34350]: error: maximum authentication attempts exceeded for invalid user test from 176.235.182.73 port 48224 ssh2 [preauth]
Dec 05 08:52:28 compute-1 sshd-session[34350]: Disconnecting invalid user test 176.235.182.73 port 48224: Too many authentication failures [preauth]
Dec 05 08:52:30 compute-1 sshd-session[34353]: Invalid user test from 176.235.182.73 port 48780
Dec 05 08:52:30 compute-1 sshd-session[34353]: Received disconnect from 176.235.182.73 port 48780:11: disconnected by user [preauth]
Dec 05 08:52:30 compute-1 sshd-session[34353]: Disconnected from invalid user test 176.235.182.73 port 48780 [preauth]
Dec 05 08:52:31 compute-1 sshd-session[34361]: Invalid user user from 176.235.182.73 port 49202
Dec 05 08:52:32 compute-1 sshd-session[34361]: error: maximum authentication attempts exceeded for invalid user user from 176.235.182.73 port 49202 ssh2 [preauth]
Dec 05 08:52:32 compute-1 sshd-session[34361]: Disconnecting invalid user user 176.235.182.73 port 49202: Too many authentication failures [preauth]
Dec 05 08:52:33 compute-1 sshd-session[34369]: Invalid user user from 176.235.182.73 port 49746
Dec 05 08:52:34 compute-1 sshd-session[34371]: Received disconnect from 43.225.158.169 port 44046:11: Bye Bye [preauth]
Dec 05 08:52:34 compute-1 sshd-session[34371]: Disconnected from authenticating user root 43.225.158.169 port 44046 [preauth]
Dec 05 08:52:34 compute-1 sshd-session[34369]: error: maximum authentication attempts exceeded for invalid user user from 176.235.182.73 port 49746 ssh2 [preauth]
Dec 05 08:52:34 compute-1 sshd-session[34369]: Disconnecting invalid user user 176.235.182.73 port 49746: Too many authentication failures [preauth]
Dec 05 08:52:35 compute-1 sshd-session[34380]: Invalid user user from 176.235.182.73 port 50303
Dec 05 08:52:36 compute-1 sshd-session[34380]: Received disconnect from 176.235.182.73 port 50303:11: disconnected by user [preauth]
Dec 05 08:52:36 compute-1 sshd-session[34380]: Disconnected from invalid user user 176.235.182.73 port 50303 [preauth]
Dec 05 08:52:37 compute-1 sshd-session[34387]: Invalid user ftpuser from 176.235.182.73 port 50697
Dec 05 08:52:37 compute-1 sshd-session[34387]: error: maximum authentication attempts exceeded for invalid user ftpuser from 176.235.182.73 port 50697 ssh2 [preauth]
Dec 05 08:52:37 compute-1 sshd-session[34387]: Disconnecting invalid user ftpuser 176.235.182.73 port 50697: Too many authentication failures [preauth]
Dec 05 08:52:39 compute-1 sshd-session[34395]: Invalid user ftpuser from 176.235.182.73 port 51124
Dec 05 08:52:39 compute-1 sshd-session[34395]: error: maximum authentication attempts exceeded for invalid user ftpuser from 176.235.182.73 port 51124 ssh2 [preauth]
Dec 05 08:52:39 compute-1 sshd-session[34395]: Disconnecting invalid user ftpuser 176.235.182.73 port 51124: Too many authentication failures [preauth]
Dec 05 08:52:41 compute-1 sshd-session[34402]: Invalid user ftpuser from 176.235.182.73 port 51643
Dec 05 08:52:41 compute-1 sshd-session[34402]: Received disconnect from 176.235.182.73 port 51643:11: disconnected by user [preauth]
Dec 05 08:52:41 compute-1 sshd-session[34402]: Disconnected from invalid user ftpuser 176.235.182.73 port 51643 [preauth]
Dec 05 08:52:42 compute-1 sshd-session[34409]: Invalid user test1 from 176.235.182.73 port 52146
Dec 05 08:52:43 compute-1 sshd-session[34409]: error: maximum authentication attempts exceeded for invalid user test1 from 176.235.182.73 port 52146 ssh2 [preauth]
Dec 05 08:52:43 compute-1 sshd-session[34409]: Disconnecting invalid user test1 176.235.182.73 port 52146: Too many authentication failures [preauth]
Dec 05 08:52:44 compute-1 sshd-session[34415]: Received disconnect from 122.168.194.41 port 36684:11: Bye Bye [preauth]
Dec 05 08:52:44 compute-1 sshd-session[34415]: Disconnected from authenticating user root 122.168.194.41 port 36684 [preauth]
Dec 05 08:52:44 compute-1 sshd-session[34418]: Invalid user test1 from 176.235.182.73 port 52599
Dec 05 08:52:45 compute-1 sshd-session[34418]: error: maximum authentication attempts exceeded for invalid user test1 from 176.235.182.73 port 52599 ssh2 [preauth]
Dec 05 08:52:45 compute-1 sshd-session[34418]: Disconnecting invalid user test1 176.235.182.73 port 52599: Too many authentication failures [preauth]
Dec 05 08:52:46 compute-1 sshd-session[34430]: Invalid user test1 from 176.235.182.73 port 53125
Dec 05 08:52:47 compute-1 sshd-session[34430]: Received disconnect from 176.235.182.73 port 53125:11: disconnected by user [preauth]
Dec 05 08:52:47 compute-1 sshd-session[34430]: Disconnected from invalid user test1 176.235.182.73 port 53125 [preauth]
Dec 05 08:52:48 compute-1 sshd-session[34437]: Invalid user test2 from 176.235.182.73 port 53583
Dec 05 08:52:49 compute-1 sshd-session[34437]: error: maximum authentication attempts exceeded for invalid user test2 from 176.235.182.73 port 53583 ssh2 [preauth]
Dec 05 08:52:49 compute-1 sshd-session[34437]: Disconnecting invalid user test2 176.235.182.73 port 53583: Too many authentication failures [preauth]
Dec 05 08:52:50 compute-1 sshd[1008]: Timeout before authentication for connection from 101.126.89.144 to 38.102.83.154, pid = 31313
Dec 05 08:52:50 compute-1 sshd-session[34449]: Invalid user test2 from 176.235.182.73 port 54150
Dec 05 08:52:51 compute-1 sshd-session[34449]: error: maximum authentication attempts exceeded for invalid user test2 from 176.235.182.73 port 54150 ssh2 [preauth]
Dec 05 08:52:51 compute-1 sshd-session[34449]: Disconnecting invalid user test2 176.235.182.73 port 54150: Too many authentication failures [preauth]
Dec 05 08:52:52 compute-1 sshd-session[34456]: Invalid user test2 from 176.235.182.73 port 54661
Dec 05 08:52:53 compute-1 sshd-session[34456]: Received disconnect from 176.235.182.73 port 54661:11: disconnected by user [preauth]
Dec 05 08:52:53 compute-1 sshd-session[34456]: Disconnected from invalid user test2 176.235.182.73 port 54661 [preauth]
Dec 05 08:52:54 compute-1 sshd-session[34464]: Invalid user ubuntu from 176.235.182.73 port 54992
Dec 05 08:52:54 compute-1 sshd-session[34464]: error: maximum authentication attempts exceeded for invalid user ubuntu from 176.235.182.73 port 54992 ssh2 [preauth]
Dec 05 08:52:54 compute-1 sshd-session[34464]: Disconnecting invalid user ubuntu 176.235.182.73 port 54992: Too many authentication failures [preauth]
Dec 05 08:52:56 compute-1 sshd-session[34489]: Invalid user ubuntu from 176.235.182.73 port 55499
Dec 05 08:52:56 compute-1 sshd-session[34489]: error: maximum authentication attempts exceeded for invalid user ubuntu from 176.235.182.73 port 55499 ssh2 [preauth]
Dec 05 08:52:56 compute-1 sshd-session[34489]: Disconnecting invalid user ubuntu 176.235.182.73 port 55499: Too many authentication failures [preauth]
Dec 05 08:52:58 compute-1 sshd-session[34496]: Invalid user ubuntu from 176.235.182.73 port 56003
Dec 05 08:52:58 compute-1 sshd-session[34496]: Received disconnect from 176.235.182.73 port 56003:11: disconnected by user [preauth]
Dec 05 08:52:58 compute-1 sshd-session[34496]: Disconnected from invalid user ubuntu 176.235.182.73 port 56003 [preauth]
Dec 05 08:52:59 compute-1 sshd-session[34525]: Invalid user pi from 176.235.182.73 port 56428
Dec 05 08:53:00 compute-1 sshd-session[34525]: Received disconnect from 176.235.182.73 port 56428:11: disconnected by user [preauth]
Dec 05 08:53:00 compute-1 sshd-session[34525]: Disconnected from invalid user pi 176.235.182.73 port 56428 [preauth]
Dec 05 08:53:01 compute-1 sshd-session[34533]: Invalid user baikal from 176.235.182.73 port 56782
Dec 05 08:53:01 compute-1 sshd-session[34533]: Received disconnect from 176.235.182.73 port 56782:11: disconnected by user [preauth]
Dec 05 08:53:01 compute-1 sshd-session[34533]: Disconnected from invalid user baikal 176.235.182.73 port 56782 [preauth]
Dec 05 08:53:30 compute-1 sshd-session[34599]: Invalid user admin from 45.135.232.92 port 38646
Dec 05 08:53:30 compute-1 sshd-session[34599]: Connection reset by invalid user admin 45.135.232.92 port 38646 [preauth]
Dec 05 08:53:30 compute-1 kernel: SELinux:  Converting 2718 SID table entries...
Dec 05 08:53:30 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 08:53:30 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 05 08:53:30 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 08:53:30 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 05 08:53:30 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 08:53:30 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 08:53:30 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 08:53:31 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 05 08:53:31 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 08:53:31 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 05 08:53:31 compute-1 systemd[1]: Reloading.
Dec 05 08:53:31 compute-1 systemd-rc-local-generator[34717]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:53:31 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 08:53:32 compute-1 sudo[33998]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:32 compute-1 sshd-session[34678]: Invalid user admin from 45.135.232.92 port 38654
Dec 05 08:53:32 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 08:53:32 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 05 08:53:32 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.340s CPU time.
Dec 05 08:53:32 compute-1 systemd[1]: run-red1be53773c74eacb319e5760e9b7b4f.service: Deactivated successfully.
Dec 05 08:53:32 compute-1 sudo[35634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yixlcacfyblbqelpbjpskdmzjvyyzbtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924812.4124038-455-77053042794101/AnsiballZ_command.py'
Dec 05 08:53:32 compute-1 sudo[35634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:32 compute-1 sshd-session[34678]: Connection reset by invalid user admin 45.135.232.92 port 38654 [preauth]
Dec 05 08:53:32 compute-1 python3.9[35636]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:53:33 compute-1 sshd-session[35486]: Received disconnect from 185.118.15.236 port 33206:11: Bye Bye [preauth]
Dec 05 08:53:33 compute-1 sshd-session[35486]: Disconnected from authenticating user root 185.118.15.236 port 33206 [preauth]
Dec 05 08:53:33 compute-1 sudo[35634]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:34 compute-1 sudo[35917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpspssraaxrvnybatoygmepajumupuid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924814.1104357-478-179101213896329/AnsiballZ_selinux.py'
Dec 05 08:53:34 compute-1 sudo[35917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:35 compute-1 sshd-session[35638]: Invalid user kodi from 45.135.232.92 port 38658
Dec 05 08:53:35 compute-1 python3.9[35919]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 05 08:53:35 compute-1 sudo[35917]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:35 compute-1 sshd-session[35638]: Connection reset by invalid user kodi 45.135.232.92 port 38658 [preauth]
Dec 05 08:53:36 compute-1 sudo[36070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siahicrwfwixrkcdbrecqgvmoieiqjzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924815.819755-511-193776628586052/AnsiballZ_command.py'
Dec 05 08:53:36 compute-1 sudo[36070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:36 compute-1 python3.9[36072]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 05 08:53:38 compute-1 sshd-session[36019]: Connection reset by authenticating user root 45.135.232.92 port 57822 [preauth]
Dec 05 08:53:39 compute-1 sudo[36070]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:40 compute-1 sudo[36226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxoomtdhwxuumtkoszilwiyxvegehivm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924820.174185-535-251822084695945/AnsiballZ_file.py'
Dec 05 08:53:40 compute-1 sudo[36226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:40 compute-1 python3.9[36228]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:53:40 compute-1 sudo[36226]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:41 compute-1 sudo[36378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbzkfgmavfzaqzbcvmlezfocckthkxoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924821.4685843-559-122830085399275/AnsiballZ_mount.py'
Dec 05 08:53:41 compute-1 sudo[36378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:42 compute-1 sshd-session[36075]: Connection reset by authenticating user root 45.135.232.92 port 57834 [preauth]
Dec 05 08:53:42 compute-1 python3.9[36380]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 05 08:53:42 compute-1 sudo[36378]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:46 compute-1 sshd-session[36405]: Received disconnect from 43.225.158.169 port 57189:11: Bye Bye [preauth]
Dec 05 08:53:46 compute-1 sshd-session[36405]: Disconnected from authenticating user root 43.225.158.169 port 57189 [preauth]
Dec 05 08:53:50 compute-1 sudo[36533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyqnbsccjitkydtsfkmdzjtskqlwwwzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924829.9214392-644-228912210002291/AnsiballZ_file.py'
Dec 05 08:53:50 compute-1 sudo[36533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:50 compute-1 python3.9[36535]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:53:50 compute-1 sudo[36533]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:53 compute-1 sudo[36685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gseiuxmqzjwswxttkjmrxihbnkbicwxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924832.7385075-667-238969367577503/AnsiballZ_stat.py'
Dec 05 08:53:53 compute-1 sudo[36685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:55 compute-1 python3.9[36687]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:53:55 compute-1 sudo[36685]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:56 compute-1 sudo[36808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dodgggciqdlahuhouflkvsllrqxyefhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924832.7385075-667-238969367577503/AnsiballZ_copy.py'
Dec 05 08:53:56 compute-1 sudo[36808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:56 compute-1 python3.9[36810]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764924832.7385075-667-238969367577503/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0908ee08d371593f319166fe14009677e91e42dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:53:56 compute-1 sudo[36808]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:57 compute-1 sudo[36960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljfttotujyuuvfcafrlqgkxvbmhqqsfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924837.3310964-739-17850458316980/AnsiballZ_stat.py'
Dec 05 08:53:57 compute-1 sudo[36960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:57 compute-1 python3.9[36962]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 08:53:57 compute-1 sudo[36960]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:58 compute-1 sudo[37112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vifpmtmiidosvjzddcsarnmkvzullssp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924838.1502311-763-122124687475595/AnsiballZ_command.py'
Dec 05 08:53:58 compute-1 sudo[37112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:58 compute-1 python3.9[37114]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:53:58 compute-1 sudo[37112]: pam_unix(sudo:session): session closed for user root
Dec 05 08:53:59 compute-1 sudo[37265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdsrkpouhoekyudnyedwqgtwgxhhcwsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924838.9581122-787-273637534533368/AnsiballZ_file.py'
Dec 05 08:53:59 compute-1 sudo[37265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:53:59 compute-1 sshd-session[37268]: Connection closed by 148.113.208.45 port 47171 [preauth]
Dec 05 08:53:59 compute-1 python3.9[37267]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:53:59 compute-1 sudo[37265]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:00 compute-1 sudo[37423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbifhpqmrebkiryqpmaeoducjynrakdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924840.0613675-820-189045692513973/AnsiballZ_getent.py'
Dec 05 08:54:00 compute-1 sudo[37423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:00 compute-1 python3.9[37425]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 05 08:54:00 compute-1 sudo[37423]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:00 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 08:54:01 compute-1 sshd-session[37294]: Received disconnect from 122.168.194.41 port 52560:11: Bye Bye [preauth]
Dec 05 08:54:01 compute-1 sshd-session[37294]: Disconnected from authenticating user root 122.168.194.41 port 52560 [preauth]
Dec 05 08:54:01 compute-1 sshd-session[37296]: Received disconnect from 122.114.113.177 port 34268:11: Bye Bye [preauth]
Dec 05 08:54:01 compute-1 sshd-session[37296]: Disconnected from authenticating user root 122.114.113.177 port 34268 [preauth]
Dec 05 08:54:01 compute-1 sudo[37577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjutptkjrvkoccppfupceffxidmfaetu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924841.1883318-844-116950100897698/AnsiballZ_group.py'
Dec 05 08:54:01 compute-1 sudo[37577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:01 compute-1 python3.9[37579]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 08:54:01 compute-1 groupadd[37580]: group added to /etc/group: name=qemu, GID=107
Dec 05 08:54:01 compute-1 groupadd[37580]: group added to /etc/gshadow: name=qemu
Dec 05 08:54:01 compute-1 groupadd[37580]: new group: name=qemu, GID=107
Dec 05 08:54:01 compute-1 sudo[37577]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:02 compute-1 sudo[37735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgidfbzjjqgarzpmtuohsmijdintxbtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924842.2131095-868-39882663599852/AnsiballZ_user.py'
Dec 05 08:54:02 compute-1 sudo[37735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:02 compute-1 python3.9[37737]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 08:54:02 compute-1 useradd[37739]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 08:54:03 compute-1 sudo[37735]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:03 compute-1 sudo[37895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snetdibobcthfjkvyyazcyqjrfsehdxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924843.2628915-892-110416931303282/AnsiballZ_getent.py'
Dec 05 08:54:03 compute-1 sudo[37895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:03 compute-1 python3.9[37897]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 05 08:54:03 compute-1 sudo[37895]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:04 compute-1 sudo[38048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjcvahyoxtefzixjlinvrcqbewsspnsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924844.0534773-916-106359071598766/AnsiballZ_group.py'
Dec 05 08:54:04 compute-1 sudo[38048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:04 compute-1 python3.9[38050]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 08:54:04 compute-1 groupadd[38051]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 05 08:54:04 compute-1 groupadd[38051]: group added to /etc/gshadow: name=hugetlbfs
Dec 05 08:54:04 compute-1 groupadd[38051]: new group: name=hugetlbfs, GID=42477
Dec 05 08:54:04 compute-1 sudo[38048]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:05 compute-1 sudo[38206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndgettlrusmerxuccnuqmvsbvkrebvkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924845.0158017-943-14297439724594/AnsiballZ_file.py'
Dec 05 08:54:05 compute-1 sudo[38206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:05 compute-1 python3.9[38208]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 05 08:54:05 compute-1 sudo[38206]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:06 compute-1 sudo[38358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgnkokvtfffckqolaawrpsojtofullmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924846.1165292-976-117048439384971/AnsiballZ_dnf.py'
Dec 05 08:54:06 compute-1 sudo[38358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:06 compute-1 python3.9[38360]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:54:09 compute-1 sudo[38358]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:09 compute-1 sudo[38511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acnaotcksupvqmwhvfrtktpfpuvcskuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924849.5923893-1000-97539314386565/AnsiballZ_file.py'
Dec 05 08:54:09 compute-1 sudo[38511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:10 compute-1 python3.9[38513]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:54:10 compute-1 sudo[38511]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:10 compute-1 sudo[38663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzdgbwseupwvytnywjqvvnjhaubjbmdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924850.3489134-1024-272163424350147/AnsiballZ_stat.py'
Dec 05 08:54:10 compute-1 sudo[38663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:10 compute-1 python3.9[38665]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:54:10 compute-1 sudo[38663]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:11 compute-1 sudo[38786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otwjwklcogyxkuwrnhxvcsprnipjraco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924850.3489134-1024-272163424350147/AnsiballZ_copy.py'
Dec 05 08:54:11 compute-1 sudo[38786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:11 compute-1 python3.9[38788]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764924850.3489134-1024-272163424350147/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:54:11 compute-1 sudo[38786]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:12 compute-1 sudo[38938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnjbjuixdilzhthlnqkixjzjbyucesbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924851.693031-1069-147116626351230/AnsiballZ_systemd.py'
Dec 05 08:54:12 compute-1 sudo[38938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:12 compute-1 python3.9[38940]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 08:54:12 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 05 08:54:12 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 05 08:54:12 compute-1 kernel: Bridge firewalling registered
Dec 05 08:54:12 compute-1 systemd-modules-load[38944]: Inserted module 'br_netfilter'
Dec 05 08:54:12 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 05 08:54:12 compute-1 sudo[38938]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:13 compute-1 sudo[39097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyhmvizmoretblmmxclxedhzhyudwvop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924852.9902642-1093-260504627430484/AnsiballZ_stat.py'
Dec 05 08:54:13 compute-1 sudo[39097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:13 compute-1 python3.9[39099]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:54:13 compute-1 sudo[39097]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:13 compute-1 sudo[39220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvplfuqvbckxkhqyvlmvxymguuiztqpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924852.9902642-1093-260504627430484/AnsiballZ_copy.py'
Dec 05 08:54:13 compute-1 sudo[39220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:14 compute-1 python3.9[39222]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764924852.9902642-1093-260504627430484/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:54:14 compute-1 sudo[39220]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:14 compute-1 sudo[39372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhaaijxrcapjreynciecccuedshjdjsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924854.7198215-1147-214938821102976/AnsiballZ_dnf.py'
Dec 05 08:54:14 compute-1 sudo[39372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:15 compute-1 python3.9[39374]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:54:19 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 05 08:54:19 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 05 08:54:19 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 08:54:19 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 05 08:54:19 compute-1 systemd[1]: Reloading.
Dec 05 08:54:19 compute-1 systemd-rc-local-generator[39433]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:54:19 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 08:54:20 compute-1 sudo[39372]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:21 compute-1 python3.9[40960]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 08:54:22 compute-1 python3.9[42023]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 05 08:54:23 compute-1 python3.9[42873]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 08:54:23 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 08:54:23 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 05 08:54:23 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.139s CPU time.
Dec 05 08:54:23 compute-1 systemd[1]: run-r68564b5a95d540fc8e6c9dd355dca929.service: Deactivated successfully.
Dec 05 08:54:23 compute-1 sudo[43534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swmcbrhvesvoqzvmbiwgjethzsovrbvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924863.5500247-1264-84345227015092/AnsiballZ_command.py'
Dec 05 08:54:23 compute-1 sudo[43534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:24 compute-1 python3.9[43536]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:54:24 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 05 08:54:24 compute-1 systemd[1]: Starting Authorization Manager...
Dec 05 08:54:24 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 05 08:54:24 compute-1 polkitd[43753]: Started polkitd version 0.117
Dec 05 08:54:24 compute-1 polkitd[43753]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 08:54:24 compute-1 polkitd[43753]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 08:54:24 compute-1 polkitd[43753]: Finished loading, compiling and executing 2 rules
Dec 05 08:54:24 compute-1 polkitd[43753]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 05 08:54:24 compute-1 systemd[1]: Started Authorization Manager.
Dec 05 08:54:24 compute-1 sudo[43534]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:25 compute-1 sudo[43921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buyyclmqkvznaptppwlgefiulnapbaco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924865.1662028-1291-114284418488497/AnsiballZ_systemd.py'
Dec 05 08:54:25 compute-1 sudo[43921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:25 compute-1 python3.9[43923]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:54:25 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 05 08:54:25 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Dec 05 08:54:25 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 05 08:54:25 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 05 08:54:26 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 05 08:54:26 compute-1 sudo[43921]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:26 compute-1 python3.9[44084]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 05 08:54:29 compute-1 sshd-session[43447]: Received disconnect from 101.47.162.91 port 56342:11: Bye Bye [preauth]
Dec 05 08:54:29 compute-1 sshd-session[43447]: Disconnected from authenticating user root 101.47.162.91 port 56342 [preauth]
Dec 05 08:54:31 compute-1 sudo[44234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzguewennixuvpliqemvjiubasgzbgos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924870.876777-1462-41970734117285/AnsiballZ_systemd.py'
Dec 05 08:54:31 compute-1 sudo[44234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:31 compute-1 python3.9[44236]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:54:31 compute-1 systemd[1]: Reloading.
Dec 05 08:54:31 compute-1 systemd-rc-local-generator[44265]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:54:31 compute-1 sudo[44234]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:32 compute-1 sudo[44422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvsllbupbmrgjtmejjrweriwehpffeha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924872.0298462-1462-75074912952468/AnsiballZ_systemd.py'
Dec 05 08:54:32 compute-1 sudo[44422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:32 compute-1 python3.9[44424]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:54:32 compute-1 systemd[1]: Reloading.
Dec 05 08:54:32 compute-1 systemd-rc-local-generator[44455]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:54:32 compute-1 sudo[44422]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:33 compute-1 sudo[44612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lplihgtcpspcdggxmueenaimnkszbrit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924873.2761092-1510-277714147927249/AnsiballZ_command.py'
Dec 05 08:54:33 compute-1 sudo[44612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:33 compute-1 python3.9[44614]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:54:33 compute-1 sudo[44612]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:34 compute-1 sudo[44765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nopdhtjlweoufsidokmfxrxondsaoqfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924874.0400946-1534-257337361434657/AnsiballZ_command.py'
Dec 05 08:54:34 compute-1 sudo[44765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:34 compute-1 python3.9[44767]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:54:34 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 05 08:54:34 compute-1 sudo[44765]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:35 compute-1 sudo[44918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pismnzgjfmjebmuldenqlgoazhalxfjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924874.8225777-1558-61929967653675/AnsiballZ_command.py'
Dec 05 08:54:35 compute-1 sudo[44918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:35 compute-1 python3.9[44920]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:54:37 compute-1 sudo[44918]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:37 compute-1 sudo[45080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqiaxeijrocvgyhkowmscwhtyovsxaji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924877.2558413-1582-247204658420592/AnsiballZ_command.py'
Dec 05 08:54:37 compute-1 sudo[45080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:37 compute-1 python3.9[45082]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:54:37 compute-1 sudo[45080]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:38 compute-1 sudo[45233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hywdeyfgfooscwgaxuniyksakvtyushm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924877.9910052-1606-79688752937566/AnsiballZ_systemd.py'
Dec 05 08:54:38 compute-1 sudo[45233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:38 compute-1 python3.9[45235]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 08:54:38 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 05 08:54:38 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Dec 05 08:54:38 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Dec 05 08:54:38 compute-1 systemd[1]: Starting Apply Kernel Variables...
Dec 05 08:54:38 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 05 08:54:38 compute-1 systemd[1]: Finished Apply Kernel Variables.
Dec 05 08:54:38 compute-1 sudo[45233]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:39 compute-1 sshd-session[31524]: Connection closed by 192.168.122.30 port 56482
Dec 05 08:54:39 compute-1 sshd-session[31521]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:54:39 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Dec 05 08:54:39 compute-1 systemd[1]: session-11.scope: Consumed 2min 28.761s CPU time.
Dec 05 08:54:39 compute-1 systemd-logind[807]: Session 11 logged out. Waiting for processes to exit.
Dec 05 08:54:39 compute-1 systemd-logind[807]: Removed session 11.
Dec 05 08:54:44 compute-1 sshd-session[45265]: Accepted publickey for zuul from 192.168.122.30 port 45552 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 08:54:44 compute-1 systemd-logind[807]: New session 12 of user zuul.
Dec 05 08:54:44 compute-1 systemd[1]: Started Session 12 of User zuul.
Dec 05 08:54:44 compute-1 sshd-session[45265]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:54:45 compute-1 python3.9[45418]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:54:47 compute-1 python3.9[45572]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:54:48 compute-1 sudo[45728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bushisovexgjnzhxtdzrgfzscdoczcmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924888.1463776-111-221633412856056/AnsiballZ_command.py'
Dec 05 08:54:48 compute-1 sudo[45728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:48 compute-1 python3.9[45730]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:54:48 compute-1 sudo[45728]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:48 compute-1 sshd-session[45601]: Received disconnect from 185.118.15.236 port 33324:11: Bye Bye [preauth]
Dec 05 08:54:48 compute-1 sshd-session[45601]: Disconnected from authenticating user root 185.118.15.236 port 33324 [preauth]
Dec 05 08:54:49 compute-1 python3.9[45881]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:54:50 compute-1 sudo[46035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfillczpyblonmniadtxbrdjrbojyren ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924890.3569725-171-234689348170363/AnsiballZ_setup.py'
Dec 05 08:54:50 compute-1 sudo[46035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:50 compute-1 python3.9[46037]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:54:51 compute-1 sudo[46035]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:51 compute-1 sudo[46119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgsjwjpscvwjqokqdowblfnbjzueihfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924890.3569725-171-234689348170363/AnsiballZ_dnf.py'
Dec 05 08:54:51 compute-1 sudo[46119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:51 compute-1 python3.9[46121]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:54:53 compute-1 sudo[46119]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:54 compute-1 sudo[46272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieahdsqxpnyycivcnckweleqgikfehsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924893.954973-207-230485733123488/AnsiballZ_setup.py'
Dec 05 08:54:54 compute-1 sudo[46272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:54 compute-1 python3.9[46274]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:54:54 compute-1 sudo[46272]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:55 compute-1 sudo[46445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjzipumcmoyebhalsembbdohwwjlrfjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924895.1256387-240-249611102999462/AnsiballZ_file.py'
Dec 05 08:54:55 compute-1 sudo[46445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:55 compute-1 python3.9[46447]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:54:55 compute-1 sudo[46445]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:56 compute-1 sshd-session[46318]: Received disconnect from 43.225.158.169 port 42097:11: Bye Bye [preauth]
Dec 05 08:54:56 compute-1 sshd-session[46318]: Disconnected from authenticating user root 43.225.158.169 port 42097 [preauth]
Dec 05 08:54:56 compute-1 sudo[46597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywsqwukjddkgydtkhezogkabzkizxycy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924896.1063821-264-144996046606404/AnsiballZ_command.py'
Dec 05 08:54:56 compute-1 sudo[46597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:56 compute-1 python3.9[46599]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:54:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2573609993-merged.mount: Deactivated successfully.
Dec 05 08:54:56 compute-1 podman[46600]: 2025-12-05 08:54:56.670000159 +0000 UTC m=+0.058296720 system refresh
Dec 05 08:54:56 compute-1 sudo[46597]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:57 compute-1 sudo[46761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbrjhshbvzkbfcnsussrqdrlgrmtsign ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924897.001454-288-138163280472444/AnsiballZ_stat.py'
Dec 05 08:54:57 compute-1 sudo[46761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:54:57 compute-1 python3.9[46763]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:54:57 compute-1 sudo[46761]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:58 compute-1 sudo[46884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghtljfjpuxegyxqxbyyvjdwaxamtanxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924897.001454-288-138163280472444/AnsiballZ_copy.py'
Dec 05 08:54:58 compute-1 sudo[46884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:58 compute-1 python3.9[46886]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764924897.001454-288-138163280472444/.source.json follow=False _original_basename=podman_network_config.j2 checksum=ff0a917a0635aff7175e03388fdd59d396443e64 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:54:58 compute-1 sudo[46884]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:58 compute-1 sudo[47036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hecarjngxnjtuuoxhtvcksquwckgajyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924898.5880184-333-57692039882072/AnsiballZ_stat.py'
Dec 05 08:54:58 compute-1 sudo[47036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:54:59 compute-1 python3.9[47038]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:54:59 compute-1 sudo[47036]: pam_unix(sudo:session): session closed for user root
Dec 05 08:54:59 compute-1 sudo[47159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdizqmfotxddgntfpfalsadjhtcprizo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924898.5880184-333-57692039882072/AnsiballZ_copy.py'
Dec 05 08:54:59 compute-1 sudo[47159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:00 compute-1 python3.9[47161]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764924898.5880184-333-57692039882072/.source.conf follow=False _original_basename=registries.conf.j2 checksum=3bde743f330b42b2957435461d5abbcdf0e7bc51 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:55:00 compute-1 sudo[47159]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:00 compute-1 sudo[47311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzgirnzrdlwhcoaebwecobhawdqlzple ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924900.4757166-381-59116150570010/AnsiballZ_ini_file.py'
Dec 05 08:55:00 compute-1 sudo[47311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:01 compute-1 python3.9[47313]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:55:01 compute-1 sudo[47311]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:01 compute-1 sudo[47463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrdrmrnjxicdslancwjdzagmggupslsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924901.2701411-381-275861892758803/AnsiballZ_ini_file.py'
Dec 05 08:55:01 compute-1 sudo[47463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:01 compute-1 python3.9[47465]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:55:01 compute-1 sudo[47463]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:02 compute-1 sudo[47615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrosnxslswfumxpsptjqknuppdecvlxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924901.9527688-381-43993555034282/AnsiballZ_ini_file.py'
Dec 05 08:55:02 compute-1 sudo[47615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:02 compute-1 python3.9[47617]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:55:02 compute-1 sudo[47615]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:02 compute-1 sudo[47767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdmqqjnlvxjrljbeiahnbqpgtvxfcofx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924902.6254034-381-134437615015174/AnsiballZ_ini_file.py'
Dec 05 08:55:02 compute-1 sudo[47767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:03 compute-1 python3.9[47769]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:55:03 compute-1 sudo[47767]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:04 compute-1 python3.9[47919]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:55:05 compute-1 sudo[48071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhnbgedlwsyklwjuwmyylkngqtmvrkgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924904.7520454-501-173459432619407/AnsiballZ_dnf.py'
Dec 05 08:55:05 compute-1 sudo[48071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:05 compute-1 python3.9[48073]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:06 compute-1 sudo[48071]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:07 compute-1 sudo[48224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txpxtagzvoyknmqgaaxxqhnqvtfrxpor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924907.1602693-525-218985316335358/AnsiballZ_dnf.py'
Dec 05 08:55:07 compute-1 sudo[48224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:07 compute-1 python3.9[48226]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:09 compute-1 sudo[48224]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:10 compute-1 sudo[48384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etrxpdwcarvfpwrdprdktrlgxlndjadp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924910.3886669-555-89653466493154/AnsiballZ_dnf.py'
Dec 05 08:55:10 compute-1 sudo[48384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:10 compute-1 python3.9[48386]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:12 compute-1 sudo[48384]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:13 compute-1 sudo[48537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irbigeetmigiricmrihpfadwiylwtske ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924912.8692951-582-9088623704952/AnsiballZ_dnf.py'
Dec 05 08:55:13 compute-1 sudo[48537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:13 compute-1 python3.9[48539]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:14 compute-1 sudo[48537]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:15 compute-1 sudo[48690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwxsyzlkzomfnldafcmkaxzensmdbgyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924915.5235755-615-205765051679990/AnsiballZ_dnf.py'
Dec 05 08:55:15 compute-1 sudo[48690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:16 compute-1 python3.9[48692]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:17 compute-1 sudo[48690]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:18 compute-1 sshd-session[48694]: Received disconnect from 122.168.194.41 port 33696:11: Bye Bye [preauth]
Dec 05 08:55:18 compute-1 sshd-session[48694]: Disconnected from authenticating user root 122.168.194.41 port 33696 [preauth]
Dec 05 08:55:18 compute-1 sudo[48848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edpbxlfxtctfxvdbjxuwerkhdobghgoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924918.0102692-639-86050311625385/AnsiballZ_dnf.py'
Dec 05 08:55:18 compute-1 sudo[48848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:18 compute-1 python3.9[48850]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:22 compute-1 sudo[48848]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:24 compute-1 sudo[49017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdrqfsxmwvxvqqnmtizkckityixbfxsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924923.8187838-666-263900333179538/AnsiballZ_dnf.py'
Dec 05 08:55:24 compute-1 sudo[49017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:24 compute-1 python3.9[49019]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:25 compute-1 sudo[49017]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:26 compute-1 sudo[49170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btfvisbzhopzjgpqtiapbebhtpxtbknz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924926.0943334-693-217041931126629/AnsiballZ_dnf.py'
Dec 05 08:55:26 compute-1 sudo[49170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:26 compute-1 python3.9[49172]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:39 compute-1 sudo[49170]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:41 compute-1 sudo[49506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-honjkqfqtwxnspawgcfpcfqitkaeccmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924941.2024088-720-217296985163181/AnsiballZ_dnf.py'
Dec 05 08:55:41 compute-1 sudo[49506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:41 compute-1 python3.9[49508]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:55:43 compute-1 sudo[49506]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:44 compute-1 sudo[49662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orznokyssvjiwgalyevutoblursdtuuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924944.0560431-753-135457388935663/AnsiballZ_file.py'
Dec 05 08:55:44 compute-1 sudo[49662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:44 compute-1 python3.9[49664]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:55:44 compute-1 sudo[49662]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:45 compute-1 sudo[49837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubkyavplliuojycxinfcrxstgqiomyqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924944.816729-777-96541513198074/AnsiballZ_stat.py'
Dec 05 08:55:45 compute-1 sudo[49837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:45 compute-1 python3.9[49839]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:55:45 compute-1 sudo[49837]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:45 compute-1 sudo[49960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-junngbgtxwvtdglfcrequtifbekdcgeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924944.816729-777-96541513198074/AnsiballZ_copy.py'
Dec 05 08:55:45 compute-1 sudo[49960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:46 compute-1 python3.9[49962]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764924944.816729-777-96541513198074/.source.json _original_basename=.2p3nhudp follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:55:46 compute-1 sudo[49960]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:47 compute-1 sudo[50112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cksoevdnjzaexyclwselfoumtfrjkfam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924946.6308675-831-157000409972129/AnsiballZ_podman_image.py'
Dec 05 08:55:47 compute-1 sudo[50112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:47 compute-1 python3.9[50114]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 08:55:47 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:55:50 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat4064981147-lower\x2dmapped.mount: Deactivated successfully.
Dec 05 08:55:56 compute-1 podman[50126]: 2025-12-05 08:55:56.417876867 +0000 UTC m=+9.028166618 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 05 08:55:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:55:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:55:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:55:56 compute-1 sudo[50112]: pam_unix(sudo:session): session closed for user root
Dec 05 08:55:57 compute-1 sudo[50421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmqcpepikagqxptxopvvmruzgzocnbii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924957.441809-865-57198780069025/AnsiballZ_podman_image.py'
Dec 05 08:55:57 compute-1 sudo[50421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:55:57 compute-1 python3.9[50423]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 08:55:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:01 compute-1 sshd-session[50485]: Received disconnect from 185.118.15.236 port 33446:11: Bye Bye [preauth]
Dec 05 08:56:01 compute-1 sshd-session[50485]: Disconnected from authenticating user root 185.118.15.236 port 33446 [preauth]
Dec 05 08:56:06 compute-1 sshd-session[50503]: Received disconnect from 43.225.158.169 port 55236:11: Bye Bye [preauth]
Dec 05 08:56:06 compute-1 sshd-session[50503]: Disconnected from authenticating user root 43.225.158.169 port 55236 [preauth]
Dec 05 08:56:12 compute-1 podman[50436]: 2025-12-05 08:56:12.592000481 +0000 UTC m=+14.555357936 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 08:56:12 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:12 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:12 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:12 compute-1 sudo[50421]: pam_unix(sudo:session): session closed for user root
Dec 05 08:56:13 compute-1 sudo[50737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbuylbxtuqhouxlfdtnxwhzcsisdqsnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924973.4038815-894-190906511652114/AnsiballZ_podman_image.py'
Dec 05 08:56:13 compute-1 sudo[50737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:56:13 compute-1 python3.9[50739]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 08:56:13 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:15 compute-1 podman[50751]: 2025-12-05 08:56:15.198654944 +0000 UTC m=+1.180601797 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 05 08:56:15 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:15 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:15 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:15 compute-1 sudo[50737]: pam_unix(sudo:session): session closed for user root
Dec 05 08:56:16 compute-1 sudo[50986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpenbukjptsauwhepsizkpedkkglibtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764924975.7831137-921-281111912923664/AnsiballZ_podman_image.py'
Dec 05 08:56:16 compute-1 sudo[50986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:56:16 compute-1 python3.9[50988]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 08:56:16 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:32 compute-1 sshd-session[51057]: Received disconnect from 122.168.194.41 port 37742:11: Bye Bye [preauth]
Dec 05 08:56:32 compute-1 sshd-session[51057]: Disconnected from authenticating user root 122.168.194.41 port 37742 [preauth]
Dec 05 08:56:42 compute-1 podman[51000]: 2025-12-05 08:56:42.79677594 +0000 UTC m=+26.426121879 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 05 08:56:42 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:42 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:42 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:43 compute-1 sudo[50986]: pam_unix(sudo:session): session closed for user root
Dec 05 08:56:44 compute-1 sudo[51261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwqjqiqkasspiiuqvofxiehaysvhtpsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925003.6864157-954-39138756856870/AnsiballZ_podman_image.py'
Dec 05 08:56:44 compute-1 sudo[51261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:56:44 compute-1 python3.9[51263]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 08:56:44 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:47 compute-1 podman[51275]: 2025-12-05 08:56:47.39824495 +0000 UTC m=+3.062973882 image pull 343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 05 08:56:47 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:47 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:47 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:47 compute-1 sudo[51261]: pam_unix(sudo:session): session closed for user root
Dec 05 08:56:48 compute-1 sudo[51535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nysetxklwxxgyaygjkfgslftrbzkzmtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925007.8276906-954-241937596664502/AnsiballZ_podman_image.py'
Dec 05 08:56:48 compute-1 sudo[51535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:56:48 compute-1 python3.9[51537]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 08:56:50 compute-1 podman[51548]: 2025-12-05 08:56:50.104671574 +0000 UTC m=+1.705808567 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 05 08:56:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:56:50 compute-1 sudo[51535]: pam_unix(sudo:session): session closed for user root
Dec 05 08:56:51 compute-1 sshd-session[45268]: Connection closed by 192.168.122.30 port 45552
Dec 05 08:56:51 compute-1 sshd-session[45265]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:56:51 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Dec 05 08:56:51 compute-1 systemd[1]: session-12.scope: Consumed 2min 27.599s CPU time.
Dec 05 08:56:51 compute-1 systemd-logind[807]: Session 12 logged out. Waiting for processes to exit.
Dec 05 08:56:51 compute-1 systemd-logind[807]: Removed session 12.
Dec 05 08:56:55 compute-1 sshd-session[51359]: Connection closed by 101.47.162.91 port 59104 [preauth]
Dec 05 08:56:56 compute-1 sshd-session[51699]: Accepted publickey for zuul from 192.168.122.30 port 50214 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 08:56:56 compute-1 systemd-logind[807]: New session 13 of user zuul.
Dec 05 08:56:56 compute-1 systemd[1]: Started Session 13 of User zuul.
Dec 05 08:56:56 compute-1 sshd-session[51699]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:56:57 compute-1 python3.9[51852]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:56:59 compute-1 sudo[52006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icxzcsikzwzlwznpviqzskygtwjdkkvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925018.9137497-69-30729451598131/AnsiballZ_getent.py'
Dec 05 08:56:59 compute-1 sudo[52006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:56:59 compute-1 python3.9[52008]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 05 08:56:59 compute-1 sudo[52006]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:00 compute-1 sudo[52159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blqqciukxwfnvvidgiuqzgxpcfpvojhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925019.8493228-93-220870991982139/AnsiballZ_group.py'
Dec 05 08:57:00 compute-1 sudo[52159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:00 compute-1 python3.9[52161]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 08:57:00 compute-1 groupadd[52162]: group added to /etc/group: name=openvswitch, GID=42476
Dec 05 08:57:00 compute-1 groupadd[52162]: group added to /etc/gshadow: name=openvswitch
Dec 05 08:57:00 compute-1 groupadd[52162]: new group: name=openvswitch, GID=42476
Dec 05 08:57:00 compute-1 sudo[52159]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:01 compute-1 sudo[52317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjdyvcdnvwoxpdnakicmhwutctgzsliu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925020.8483458-117-118382864931674/AnsiballZ_user.py'
Dec 05 08:57:01 compute-1 sudo[52317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:01 compute-1 python3.9[52319]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 08:57:01 compute-1 useradd[52321]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 08:57:01 compute-1 useradd[52321]: add 'openvswitch' to group 'hugetlbfs'
Dec 05 08:57:01 compute-1 useradd[52321]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 05 08:57:01 compute-1 sudo[52317]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:03 compute-1 sudo[52477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udjurhnxzbiqndqimjmgtsnfrasnmslw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925023.0064604-147-190262258403416/AnsiballZ_setup.py'
Dec 05 08:57:03 compute-1 sudo[52477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:03 compute-1 python3.9[52479]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:57:03 compute-1 sudo[52477]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:04 compute-1 sudo[52561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aevzlzeuvkkjpnljgyobdufcgrfglleg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925023.0064604-147-190262258403416/AnsiballZ_dnf.py'
Dec 05 08:57:04 compute-1 sudo[52561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:04 compute-1 python3.9[52563]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 08:57:06 compute-1 sudo[52561]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:07 compute-1 sudo[52724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yworgtnslfkyiaxpfxvdbmyxibyeayzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925027.3607252-189-20514430979218/AnsiballZ_dnf.py'
Dec 05 08:57:07 compute-1 sudo[52724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:07 compute-1 python3.9[52726]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:57:17 compute-1 sshd-session[52743]: Received disconnect from 185.118.15.236 port 33568:11: Bye Bye [preauth]
Dec 05 08:57:17 compute-1 sshd-session[52743]: Disconnected from authenticating user root 185.118.15.236 port 33568 [preauth]
Dec 05 08:57:17 compute-1 sshd-session[52741]: Received disconnect from 43.225.158.169 port 40144:11: Bye Bye [preauth]
Dec 05 08:57:17 compute-1 sshd-session[52741]: Disconnected from authenticating user root 43.225.158.169 port 40144 [preauth]
Dec 05 08:57:24 compute-1 kernel: SELinux:  Converting 2731 SID table entries...
Dec 05 08:57:24 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 08:57:24 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 05 08:57:24 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 08:57:24 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 05 08:57:24 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 08:57:24 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 08:57:24 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 08:57:24 compute-1 groupadd[52753]: group added to /etc/group: name=unbound, GID=993
Dec 05 08:57:24 compute-1 groupadd[52753]: group added to /etc/gshadow: name=unbound
Dec 05 08:57:24 compute-1 groupadd[52753]: new group: name=unbound, GID=993
Dec 05 08:57:24 compute-1 useradd[52760]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 05 08:57:24 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 05 08:57:24 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 05 08:57:26 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 08:57:26 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 05 08:57:26 compute-1 systemd[1]: Reloading.
Dec 05 08:57:26 compute-1 systemd-rc-local-generator[53260]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:57:26 compute-1 systemd-sysv-generator[53263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:57:26 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 08:57:27 compute-1 sudo[52724]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:27 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 08:57:27 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 05 08:57:27 compute-1 systemd[1]: run-r19117f92b6514b8cab8c33aa1795f675.service: Deactivated successfully.
Dec 05 08:57:29 compute-1 sudo[53827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tomcxgqzpykdacuomzjbdirokumhdhze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925048.4671574-213-83786659021442/AnsiballZ_systemd.py'
Dec 05 08:57:29 compute-1 sudo[53827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:29 compute-1 python3.9[53829]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 08:57:29 compute-1 systemd[1]: Reloading.
Dec 05 08:57:29 compute-1 systemd-rc-local-generator[53858]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:57:29 compute-1 systemd-sysv-generator[53863]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:57:29 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Dec 05 08:57:29 compute-1 chown[53871]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 05 08:57:29 compute-1 ovs-ctl[53876]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 05 08:57:29 compute-1 ovs-ctl[53876]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 05 08:57:29 compute-1 ovs-ctl[53876]: Starting ovsdb-server [  OK  ]
Dec 05 08:57:29 compute-1 ovs-vsctl[53925]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 05 08:57:30 compute-1 ovs-vsctl[53941]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2deaed7a-68f6-453c-b7f8-10ef033f3762\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 05 08:57:30 compute-1 ovs-ctl[53876]: Configuring Open vSwitch system IDs [  OK  ]
Dec 05 08:57:30 compute-1 ovs-ctl[53876]: Enabling remote OVSDB managers [  OK  ]
Dec 05 08:57:30 compute-1 ovs-vsctl[53951]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec 05 08:57:30 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Dec 05 08:57:30 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 05 08:57:30 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 05 08:57:30 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 05 08:57:30 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Dec 05 08:57:30 compute-1 ovs-ctl[53995]: Inserting openvswitch module [  OK  ]
Dec 05 08:57:30 compute-1 ovs-ctl[53964]: Starting ovs-vswitchd [  OK  ]
Dec 05 08:57:30 compute-1 ovs-vsctl[54013]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec 05 08:57:30 compute-1 ovs-ctl[53964]: Enabling remote OVSDB managers [  OK  ]
Dec 05 08:57:30 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 05 08:57:30 compute-1 systemd[1]: Starting Open vSwitch...
Dec 05 08:57:30 compute-1 systemd[1]: Finished Open vSwitch.
Dec 05 08:57:30 compute-1 sudo[53827]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:31 compute-1 python3.9[54164]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:57:32 compute-1 sudo[54314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjojtyxeykmejrfayhaneutgnnqpoemi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925051.8506114-267-8566547581601/AnsiballZ_sefcontext.py'
Dec 05 08:57:32 compute-1 sudo[54314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:32 compute-1 python3.9[54316]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 05 08:57:33 compute-1 kernel: SELinux:  Converting 2745 SID table entries...
Dec 05 08:57:33 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 08:57:33 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 05 08:57:33 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 08:57:33 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 05 08:57:33 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 08:57:33 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 08:57:33 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 08:57:34 compute-1 sudo[54314]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:35 compute-1 python3.9[54472]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:57:36 compute-1 sudo[54628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdmlngkiegifnuhfftqfmuyrwdkxgacs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925056.0531616-321-1311245905107/AnsiballZ_dnf.py'
Dec 05 08:57:36 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 05 08:57:36 compute-1 sudo[54628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:36 compute-1 python3.9[54630]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:57:38 compute-1 sudo[54628]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:38 compute-1 sudo[54781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmpsblfgendxjslonqitxndyfoudzsdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925058.3305857-345-236825609662161/AnsiballZ_command.py'
Dec 05 08:57:38 compute-1 sudo[54781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:39 compute-1 python3.9[54783]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:57:39 compute-1 sudo[54781]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:40 compute-1 sudo[55068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbtghklxfnualoclaqsxwsxqrwtrnamd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925060.1720426-369-76838547415408/AnsiballZ_file.py'
Dec 05 08:57:40 compute-1 sudo[55068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:40 compute-1 python3.9[55070]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Dec 05 08:57:40 compute-1 sudo[55068]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:41 compute-1 python3.9[55220]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 08:57:42 compute-1 sudo[55372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnklkifzcriopjhwoxsycnouwvzqweaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925062.0288172-417-216370396060602/AnsiballZ_dnf.py'
Dec 05 08:57:42 compute-1 sudo[55372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:42 compute-1 python3.9[55374]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:57:44 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 08:57:44 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 05 08:57:44 compute-1 systemd[1]: Reloading.
Dec 05 08:57:44 compute-1 systemd-rc-local-generator[55410]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:57:44 compute-1 systemd-sysv-generator[55417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:57:44 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 08:57:45 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 08:57:45 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 05 08:57:45 compute-1 systemd[1]: run-rf43283f4fc6d48899c0301be4a8c0526.service: Deactivated successfully.
Dec 05 08:57:45 compute-1 sudo[55372]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:45 compute-1 sudo[55688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vugvibrqunwqvlgaotcmfofvpqkxkuae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925065.5551355-441-104219186371058/AnsiballZ_systemd.py'
Dec 05 08:57:45 compute-1 sudo[55688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:46 compute-1 python3.9[55690]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 08:57:46 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 05 08:57:46 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Dec 05 08:57:46 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Dec 05 08:57:46 compute-1 systemd[1]: Stopping Network Manager...
Dec 05 08:57:46 compute-1 NetworkManager[7190]: <info>  [1764925066.2695] caught SIGTERM, shutting down normally.
Dec 05 08:57:46 compute-1 NetworkManager[7190]: <info>  [1764925066.2717] dhcp4 (eth0): canceled DHCP transaction
Dec 05 08:57:46 compute-1 NetworkManager[7190]: <info>  [1764925066.2717] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:57:46 compute-1 NetworkManager[7190]: <info>  [1764925066.2717] dhcp4 (eth0): state changed no lease
Dec 05 08:57:46 compute-1 NetworkManager[7190]: <info>  [1764925066.2721] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 08:57:46 compute-1 NetworkManager[7190]: <info>  [1764925066.2795] exiting (success)
Dec 05 08:57:46 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 08:57:46 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 08:57:46 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 05 08:57:46 compute-1 systemd[1]: Stopped Network Manager.
Dec 05 08:57:46 compute-1 systemd[1]: NetworkManager.service: Consumed 15.486s CPU time, 4.1M memory peak, read 0B from disk, written 18.5K to disk.
Dec 05 08:57:46 compute-1 systemd[1]: Starting Network Manager...
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.3644] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:37103174-0a80-476d-aa28-333d5ef7214b)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.3645] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.3709] manager[0x5622fa44a090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 08:57:46 compute-1 systemd[1]: Starting Hostname Service...
Dec 05 08:57:46 compute-1 systemd[1]: Started Hostname Service.
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4695] hostname: hostname: using hostnamed
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4697] hostname: static hostname changed from (none) to "compute-1"
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4706] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4711] manager[0x5622fa44a090]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4712] manager[0x5622fa44a090]: rfkill: WWAN hardware radio set enabled
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4738] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4748] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4749] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4749] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4750] manager: Networking is enabled by state file
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4752] settings: Loaded settings plugin: keyfile (internal)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4756] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4808] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4825] dhcp: init: Using DHCP client 'internal'
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4831] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4841] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4854] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4870] device (lo): Activation: starting connection 'lo' (2b3ccb97-e960-48b1-9417-7b23d43663c4)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4882] device (eth0): carrier: link connected
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4892] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4903] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4905] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4914] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4922] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4931] device (eth1): carrier: link connected
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4935] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4941] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (91a49ed8-302a-59cf-a590-ba724ca6d638) (indicated)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4942] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4949] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4956] device (eth1): Activation: starting connection 'ci-private-network' (91a49ed8-302a-59cf-a590-ba724ca6d638)
Dec 05 08:57:46 compute-1 systemd[1]: Started Network Manager.
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4964] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4978] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4980] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4982] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.4984] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5000] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5005] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5008] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5013] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5021] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5025] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5064] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5078] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5088] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5090] dhcp4 (eth0): state changed new lease, address=38.102.83.154
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5093] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5099] device (lo): Activation: successful, device activated.
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5110] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 08:57:46 compute-1 systemd[1]: Starting Network Manager Wait Online...
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5184] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5191] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5192] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5197] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5200] device (eth1): Activation: successful, device activated.
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5208] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5209] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5212] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5215] device (eth0): Activation: successful, device activated.
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5219] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 08:57:46 compute-1 NetworkManager[55704]: <info>  [1764925066.5221] manager: startup complete
Dec 05 08:57:46 compute-1 systemd[1]: Finished Network Manager Wait Online.
Dec 05 08:57:46 compute-1 sudo[55688]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:47 compute-1 sudo[55915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwhfnxhqylhfcwdzvqzosohavpkxdesc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925066.7877467-465-47715957396501/AnsiballZ_dnf.py'
Dec 05 08:57:47 compute-1 sudo[55915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:47 compute-1 python3.9[55917]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:57:49 compute-1 sshd-session[55919]: Received disconnect from 122.168.194.41 port 37822:11: Bye Bye [preauth]
Dec 05 08:57:49 compute-1 sshd-session[55919]: Disconnected from authenticating user root 122.168.194.41 port 37822 [preauth]
Dec 05 08:57:52 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 08:57:52 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 05 08:57:52 compute-1 systemd[1]: Reloading.
Dec 05 08:57:52 compute-1 systemd-sysv-generator[55971]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:57:52 compute-1 systemd-rc-local-generator[55967]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:57:52 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 08:57:53 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 08:57:53 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 05 08:57:53 compute-1 systemd[1]: run-r4d2048b5af944c4fa981c6a138777329.service: Deactivated successfully.
Dec 05 08:57:53 compute-1 sudo[55915]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:55 compute-1 sudo[56377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngvcwpucxdsbadxzieunnxdzfymmjhgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925075.4219272-501-231329162173702/AnsiballZ_stat.py'
Dec 05 08:57:55 compute-1 sudo[56377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:55 compute-1 python3.9[56379]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 08:57:55 compute-1 sudo[56377]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:56 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 08:57:56 compute-1 sudo[56529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drytpwoxcbnlzazuarzizquxlqjskziu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925076.263946-528-271110872950806/AnsiballZ_ini_file.py'
Dec 05 08:57:56 compute-1 sudo[56529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:56 compute-1 python3.9[56531]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:57:56 compute-1 sudo[56529]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:57 compute-1 sudo[56683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnxfzefwdusggdscyphrywpfzxgenged ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925077.2657726-558-148561887697432/AnsiballZ_ini_file.py'
Dec 05 08:57:57 compute-1 sudo[56683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:57 compute-1 python3.9[56685]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:57:57 compute-1 sudo[56683]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:58 compute-1 sudo[56835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imqvpygarnywtoyvzomkeojzgmkkhbci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925078.289807-558-70347746825776/AnsiballZ_ini_file.py'
Dec 05 08:57:58 compute-1 sudo[56835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:58 compute-1 python3.9[56837]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:57:58 compute-1 sudo[56835]: pam_unix(sudo:session): session closed for user root
Dec 05 08:57:59 compute-1 sudo[56987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igqiwpzgtvzzxkelofriyqfxuacdcvga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925079.0361254-603-68906459936194/AnsiballZ_ini_file.py'
Dec 05 08:57:59 compute-1 sudo[56987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:57:59 compute-1 python3.9[56989]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:57:59 compute-1 sudo[56987]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:00 compute-1 sudo[57139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzntkkbxigqxotqupzgfrlmrtvoevtzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925079.690852-603-242914840520150/AnsiballZ_ini_file.py'
Dec 05 08:58:00 compute-1 sudo[57139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:00 compute-1 python3.9[57141]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:58:00 compute-1 sudo[57139]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:00 compute-1 sudo[57291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tctazmjhgknytwfjcsdrivchbgaqamdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925080.4887516-648-259657891839688/AnsiballZ_stat.py'
Dec 05 08:58:00 compute-1 sudo[57291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:00 compute-1 python3.9[57293]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:58:01 compute-1 sudo[57291]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:01 compute-1 sudo[57414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoeuzjyruovwpdkkfasxjdtolbgiyrys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925080.4887516-648-259657891839688/AnsiballZ_copy.py'
Dec 05 08:58:01 compute-1 sudo[57414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:01 compute-1 python3.9[57416]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925080.4887516-648-259657891839688/.source _original_basename=.ewhfky16 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:58:01 compute-1 sudo[57414]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:02 compute-1 sudo[57566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxpzffgovsjcuydhivmwdvgcdahgflln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925082.0105038-693-146929962327370/AnsiballZ_file.py'
Dec 05 08:58:02 compute-1 sudo[57566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:02 compute-1 python3.9[57568]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:58:02 compute-1 sudo[57566]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:03 compute-1 sudo[57718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dffrudydbddbeteztnanzjeazkrrnnka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925083.0595882-717-43169734717701/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 05 08:58:03 compute-1 sudo[57718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:03 compute-1 python3.9[57720]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 05 08:58:03 compute-1 sudo[57718]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:04 compute-1 sudo[57870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhsrjyiimauynhdagxmltwtwvlnfvpkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925084.0992339-744-179981366211756/AnsiballZ_file.py'
Dec 05 08:58:04 compute-1 sudo[57870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:04 compute-1 python3.9[57872]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:58:04 compute-1 sudo[57870]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:05 compute-1 sudo[58022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhvxphwqnmijjhplcbvuevikvzqgchfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925085.056312-774-139705766679758/AnsiballZ_stat.py'
Dec 05 08:58:05 compute-1 sudo[58022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:05 compute-1 sudo[58022]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:05 compute-1 sudo[58145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqgyzehyezhfgzlvusnmbzfvtxqgieie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925085.056312-774-139705766679758/AnsiballZ_copy.py'
Dec 05 08:58:05 compute-1 sudo[58145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:06 compute-1 sudo[58145]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:06 compute-1 sudo[58297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smihdxqueretblldmaerljejjjgbtzog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925086.4256532-819-146479078337530/AnsiballZ_slurp.py'
Dec 05 08:58:06 compute-1 sudo[58297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:07 compute-1 python3.9[58299]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 05 08:58:07 compute-1 sudo[58297]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:08 compute-1 sudo[58472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqjkqhangouhafaltnnaqpvffaqiptkv ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925087.7595055-846-81660691043908/async_wrapper.py j541135313231 300 /home/zuul/.ansible/tmp/ansible-tmp-1764925087.7595055-846-81660691043908/AnsiballZ_edpm_os_net_config.py _'
Dec 05 08:58:08 compute-1 sudo[58472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:08 compute-1 ansible-async_wrapper.py[58474]: Invoked with j541135313231 300 /home/zuul/.ansible/tmp/ansible-tmp-1764925087.7595055-846-81660691043908/AnsiballZ_edpm_os_net_config.py _
Dec 05 08:58:08 compute-1 ansible-async_wrapper.py[58479]: Starting module and watcher
Dec 05 08:58:08 compute-1 ansible-async_wrapper.py[58479]: Start watching 58480 (300)
Dec 05 08:58:08 compute-1 ansible-async_wrapper.py[58480]: Start module (58480)
Dec 05 08:58:08 compute-1 ansible-async_wrapper.py[58474]: Return async_wrapper task started.
Dec 05 08:58:08 compute-1 sudo[58472]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:08 compute-1 python3.9[58481]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 05 08:58:09 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 05 08:58:09 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 05 08:58:09 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 05 08:58:09 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 05 08:58:09 compute-1 kernel: cfg80211: failed to load regulatory.db
Dec 05 08:58:09 compute-1 sshd-session[58475]: Received disconnect from 122.114.113.177 port 54420:11: Bye Bye [preauth]
Dec 05 08:58:09 compute-1 sshd-session[58475]: Disconnected from authenticating user root 122.114.113.177 port 54420 [preauth]
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.6971] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.6994] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7569] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7573] audit: op="connection-add" uuid="00540b90-4623-4aa8-89c7-33d5407eb2c0" name="br-ex-br" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7589] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7591] audit: op="connection-add" uuid="bcff6bc9-6569-405c-ab6b-3892c2f0363a" name="br-ex-port" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7603] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7604] audit: op="connection-add" uuid="aeace326-a548-46c9-b151-b6f8ea7a048e" name="eth1-port" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7614] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7616] audit: op="connection-add" uuid="48f611b7-4cc9-4868-8477-596557d45c38" name="vlan20-port" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7626] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7627] audit: op="connection-add" uuid="b15e2a2f-2c9e-4acc-8d47-22141ba128d3" name="vlan21-port" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7637] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7638] audit: op="connection-add" uuid="84f7f286-ca7f-4669-b264-afa19db3f4cf" name="vlan22-port" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7658] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7671] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7673] audit: op="connection-add" uuid="db65a364-d1be-4d53-a4f0-a01879732438" name="br-ex-if" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7749] audit: op="connection-update" uuid="91a49ed8-302a-59cf-a590-ba724ca6d638" name="ci-private-network" args="connection.slave-type,connection.port-type,connection.controller,connection.master,connection.timestamp,ipv6.addresses,ipv6.dns,ipv6.method,ipv6.routing-rules,ipv6.routes,ipv6.addr-gen-mode,ipv4.addresses,ipv4.dns,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ovs-external-ids.data,ovs-interface.type" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7768] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7770] audit: op="connection-add" uuid="47410f6f-fe89-48f1-8cfa-6ca0c468e8c1" name="vlan20-if" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7784] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7785] audit: op="connection-add" uuid="d281bbb0-10f5-4425-a39b-d64a1c526bf6" name="vlan21-if" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7799] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7800] audit: op="connection-add" uuid="1bd5d599-069a-46a1-8de5-09c3472c5f2e" name="vlan22-if" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7812] audit: op="connection-delete" uuid="b6c4a16e-f9a7-3917-b1a2-dbfcd5a6a2e4" name="Wired connection 1" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7827] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7838] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7843] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (00540b90-4623-4aa8-89c7-33d5407eb2c0)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7844] audit: op="connection-activate" uuid="00540b90-4623-4aa8-89c7-33d5407eb2c0" name="br-ex-br" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7847] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7853] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7858] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (bcff6bc9-6569-405c-ab6b-3892c2f0363a)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7860] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7866] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7870] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (aeace326-a548-46c9-b151-b6f8ea7a048e)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7873] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7879] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7884] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (48f611b7-4cc9-4868-8477-596557d45c38)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7886] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7892] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7896] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (b15e2a2f-2c9e-4acc-8d47-22141ba128d3)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7898] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7904] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7908] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (84f7f286-ca7f-4669-b264-afa19db3f4cf)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7910] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7912] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7915] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7920] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7924] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7928] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (db65a364-d1be-4d53-a4f0-a01879732438)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7929] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7931] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7933] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7935] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7936] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7945] device (eth1): disconnecting for new activation request.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7946] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7949] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7950] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7951] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7953] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7958] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7961] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (47410f6f-fe89-48f1-8cfa-6ca0c468e8c1)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7962] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7965] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7967] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7968] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7971] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7975] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7979] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (d281bbb0-10f5-4425-a39b-d64a1c526bf6)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7980] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7982] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7984] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7985] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7987] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7992] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7995] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (1bd5d599-069a-46a1-8de5-09c3472c5f2e)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7996] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.7998] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8000] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8001] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8002] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8016] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8018] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8020] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8022] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8028] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8031] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8034] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8036] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8038] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8041] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8044] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8046] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8048] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8053] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 kernel: ovs-system: entered promiscuous mode
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8058] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8062] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8065] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8071] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8075] dhcp4 (eth0): canceled DHCP transaction
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8075] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8075] dhcp4 (eth0): state changed no lease
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8078] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 05 08:58:10 compute-1 kernel: Timeout policy base is empty
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8090] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8093] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58482 uid=0 result="fail" reason="Device is not activated"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8098] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 05 08:58:10 compute-1 systemd-udevd[58488]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8140] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8145] dhcp4 (eth0): state changed new lease, address=38.102.83.154
Dec 05 08:58:10 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8213] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8223] device (eth1): disconnecting for new activation request.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8223] audit: op="connection-activate" uuid="91a49ed8-302a-59cf-a590-ba724ca6d638" name="ci-private-network" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8260] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58482 uid=0 result="success"
Dec 05 08:58:10 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8273] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8356] device (eth1): Activation: starting connection 'ci-private-network' (91a49ed8-302a-59cf-a590-ba724ca6d638)
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8368] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8371] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8377] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8379] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8380] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8381] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8382] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8383] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8386] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8392] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8396] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8399] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8402] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8405] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8408] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8410] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8413] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8418] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 kernel: br-ex: entered promiscuous mode
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8425] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8429] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8432] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8436] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8458] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8487] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8491] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8502] device (eth1): Activation: successful, device activated.
Dec 05 08:58:10 compute-1 kernel: vlan22: entered promiscuous mode
Dec 05 08:58:10 compute-1 systemd-udevd[58486]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8551] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8583] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 kernel: vlan21: entered promiscuous mode
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8623] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8627] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8637] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8653] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 05 08:58:10 compute-1 kernel: vlan20: entered promiscuous mode
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8672] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8722] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8729] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8736] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8750] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8792] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8852] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8853] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8860] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8867] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8887] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8923] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8931] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 08:58:10 compute-1 NetworkManager[55704]: <info>  [1764925090.8939] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.0020] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58482 uid=0 result="success"
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.1703] checkpoint[0x5622fa420950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.1708] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58482 uid=0 result="success"
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.4951] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58482 uid=0 result="success"
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.4960] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58482 uid=0 result="success"
Dec 05 08:58:12 compute-1 sudo[58814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bskwxqcshqfjjmjaoirbuczrlornvgrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925092.171172-846-55863337340372/AnsiballZ_async_status.py'
Dec 05 08:58:12 compute-1 sudo[58814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.7002] audit: op="networking-control" arg="global-dns-configuration" pid=58482 uid=0 result="success"
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.7043] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.7076] audit: op="networking-control" arg="global-dns-configuration" pid=58482 uid=0 result="success"
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.7098] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58482 uid=0 result="success"
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.8404] checkpoint[0x5622fa420a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 05 08:58:12 compute-1 NetworkManager[55704]: <info>  [1764925092.8407] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58482 uid=0 result="success"
Dec 05 08:58:12 compute-1 python3.9[58816]: ansible-ansible.legacy.async_status Invoked with jid=j541135313231.58474 mode=status _async_dir=/root/.ansible_async
Dec 05 08:58:12 compute-1 sudo[58814]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:12 compute-1 ansible-async_wrapper.py[58480]: Module complete (58480)
Dec 05 08:58:13 compute-1 ansible-async_wrapper.py[58479]: Done in kid B.
Dec 05 08:58:16 compute-1 sudo[58919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxywslstbcfsvvrlfhfciewjxolqumss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925092.171172-846-55863337340372/AnsiballZ_async_status.py'
Dec 05 08:58:16 compute-1 sudo[58919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:16 compute-1 python3.9[58921]: ansible-ansible.legacy.async_status Invoked with jid=j541135313231.58474 mode=status _async_dir=/root/.ansible_async
Dec 05 08:58:16 compute-1 sudo[58919]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:16 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 08:58:16 compute-1 sudo[59020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrgjrdxnhutuhaxbknhwsvoskwwcgybn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925092.171172-846-55863337340372/AnsiballZ_async_status.py'
Dec 05 08:58:16 compute-1 sudo[59020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:16 compute-1 python3.9[59022]: ansible-ansible.legacy.async_status Invoked with jid=j541135313231.58474 mode=cleanup _async_dir=/root/.ansible_async
Dec 05 08:58:16 compute-1 sudo[59020]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:17 compute-1 sudo[59172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjozdqpdqpvfmhtfjoddqvlaumjmhrus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925097.258828-922-184534884330693/AnsiballZ_stat.py'
Dec 05 08:58:17 compute-1 sudo[59172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:17 compute-1 python3.9[59174]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:58:17 compute-1 sudo[59172]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:18 compute-1 sudo[59295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcxhewylofxlonrfplznelfbkcrllzph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925097.258828-922-184534884330693/AnsiballZ_copy.py'
Dec 05 08:58:18 compute-1 sudo[59295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:18 compute-1 python3.9[59297]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925097.258828-922-184534884330693/.source.returncode _original_basename=.ubqqd4qo follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:58:18 compute-1 sudo[59295]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:18 compute-1 sudo[59447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkusexhkisubqofsffzitxnmttvvaryq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925098.6724792-970-41529406024148/AnsiballZ_stat.py'
Dec 05 08:58:18 compute-1 sudo[59447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:19 compute-1 python3.9[59449]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:58:19 compute-1 sudo[59447]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:19 compute-1 sudo[59571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjfsimrwdpuordzrvhylwhvofjitnbgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925098.6724792-970-41529406024148/AnsiballZ_copy.py'
Dec 05 08:58:19 compute-1 sudo[59571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:19 compute-1 python3.9[59573]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925098.6724792-970-41529406024148/.source.cfg _original_basename=.9142zn9h follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:58:19 compute-1 sudo[59571]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:20 compute-1 sudo[59723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaxwuifwqgotzwhwoanciyhepyjmjlcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925099.9477022-1015-78502304358978/AnsiballZ_systemd.py'
Dec 05 08:58:20 compute-1 sudo[59723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:20 compute-1 python3.9[59725]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 08:58:20 compute-1 systemd[1]: Reloading Network Manager...
Dec 05 08:58:20 compute-1 NetworkManager[55704]: <info>  [1764925100.5956] audit: op="reload" arg="0" pid=59729 uid=0 result="success"
Dec 05 08:58:20 compute-1 NetworkManager[55704]: <info>  [1764925100.5964] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 05 08:58:20 compute-1 systemd[1]: Reloaded Network Manager.
Dec 05 08:58:20 compute-1 sudo[59723]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:21 compute-1 sshd-session[51702]: Connection closed by 192.168.122.30 port 50214
Dec 05 08:58:21 compute-1 sshd-session[51699]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:58:21 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Dec 05 08:58:21 compute-1 systemd[1]: session-13.scope: Consumed 55.422s CPU time.
Dec 05 08:58:21 compute-1 systemd-logind[807]: Session 13 logged out. Waiting for processes to exit.
Dec 05 08:58:21 compute-1 systemd-logind[807]: Removed session 13.
Dec 05 08:58:26 compute-1 sshd-session[59763]: Accepted publickey for zuul from 192.168.122.30 port 37806 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 08:58:26 compute-1 systemd-logind[807]: New session 14 of user zuul.
Dec 05 08:58:26 compute-1 systemd[1]: Started Session 14 of User zuul.
Dec 05 08:58:26 compute-1 sshd-session[59763]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:58:26 compute-1 sshd-session[59760]: Received disconnect from 43.225.158.169 port 53284:11: Bye Bye [preauth]
Dec 05 08:58:26 compute-1 sshd-session[59760]: Disconnected from authenticating user root 43.225.158.169 port 53284 [preauth]
Dec 05 08:58:27 compute-1 python3.9[59916]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:58:28 compute-1 python3.9[60072]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:58:29 compute-1 sshd-session[59921]: Received disconnect from 185.118.15.236 port 33690:11: Bye Bye [preauth]
Dec 05 08:58:29 compute-1 sshd-session[59921]: Disconnected from authenticating user root 185.118.15.236 port 33690 [preauth]
Dec 05 08:58:30 compute-1 python3.9[60262]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:58:30 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 08:58:30 compute-1 sshd-session[59766]: Connection closed by 192.168.122.30 port 37806
Dec 05 08:58:30 compute-1 sshd-session[59763]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:58:30 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Dec 05 08:58:30 compute-1 systemd[1]: session-14.scope: Consumed 2.344s CPU time.
Dec 05 08:58:30 compute-1 systemd-logind[807]: Session 14 logged out. Waiting for processes to exit.
Dec 05 08:58:30 compute-1 systemd-logind[807]: Removed session 14.
Dec 05 08:58:36 compute-1 sshd-session[60291]: Accepted publickey for zuul from 192.168.122.30 port 42902 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 08:58:36 compute-1 systemd-logind[807]: New session 15 of user zuul.
Dec 05 08:58:36 compute-1 systemd[1]: Started Session 15 of User zuul.
Dec 05 08:58:36 compute-1 sshd-session[60291]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:58:37 compute-1 python3.9[60444]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:58:39 compute-1 python3.9[60599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:58:39 compute-1 sudo[60753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggarjjesgqowpzfowvuatgshueglcnnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925119.4875288-81-109509445281540/AnsiballZ_setup.py'
Dec 05 08:58:39 compute-1 sudo[60753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:40 compute-1 python3.9[60755]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:58:40 compute-1 sudo[60753]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:40 compute-1 sudo[60837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kykbrzyoceekqymhwewmqzshhkrjxcql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925119.4875288-81-109509445281540/AnsiballZ_dnf.py'
Dec 05 08:58:40 compute-1 sudo[60837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:41 compute-1 python3.9[60839]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:58:42 compute-1 sudo[60837]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:43 compute-1 sudo[60991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knmlfkpqivojzhhxpprcxzbuqifmfzsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925123.1032376-117-173711173886564/AnsiballZ_setup.py'
Dec 05 08:58:43 compute-1 sudo[60991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:43 compute-1 python3.9[60993]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:58:44 compute-1 sudo[60991]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:44 compute-1 sudo[61182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cazdvgxfdpqszfsnenamimielskewcbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925124.4516518-150-75289490105327/AnsiballZ_file.py'
Dec 05 08:58:44 compute-1 sudo[61182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:45 compute-1 python3.9[61184]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:58:45 compute-1 sudo[61182]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:45 compute-1 sudo[61334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjfklfbcvcvoinpqdaypopxliilhooqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925125.3664896-174-215204015422988/AnsiballZ_command.py'
Dec 05 08:58:45 compute-1 sudo[61334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:45 compute-1 python3.9[61336]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:58:46 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 08:58:46 compute-1 sudo[61334]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:46 compute-1 sudo[61497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emxceqjfitwiyxikamyphdpmojhrvkmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925126.3112633-198-39012798461064/AnsiballZ_stat.py'
Dec 05 08:58:46 compute-1 sudo[61497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:46 compute-1 python3.9[61499]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:58:47 compute-1 sudo[61497]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:47 compute-1 sudo[61575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssderwwiifagkncwgpkwnetmrymodahu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925126.3112633-198-39012798461064/AnsiballZ_file.py'
Dec 05 08:58:47 compute-1 sudo[61575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:47 compute-1 python3.9[61577]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:58:47 compute-1 sudo[61575]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:48 compute-1 sudo[61727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymdbfmdgxqhkxdqtfwiiikdqljhnacwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925127.7390833-234-45356944681620/AnsiballZ_stat.py'
Dec 05 08:58:48 compute-1 sudo[61727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:48 compute-1 python3.9[61729]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:58:48 compute-1 sudo[61727]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:48 compute-1 sudo[61805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krsldcvrspgqxgtqhsfldzwejqimqord ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925127.7390833-234-45356944681620/AnsiballZ_file.py'
Dec 05 08:58:48 compute-1 sudo[61805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:48 compute-1 python3.9[61807]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:58:48 compute-1 sudo[61805]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:49 compute-1 sudo[61957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxlreqmhaoikisalcujzwqolqpjlqlch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925128.943181-273-24276966138455/AnsiballZ_ini_file.py'
Dec 05 08:58:49 compute-1 sudo[61957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:49 compute-1 python3.9[61959]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:58:49 compute-1 sudo[61957]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:50 compute-1 sudo[62109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfxxziarjzdumjsmzilbaizhbeauutgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925129.7387228-273-191456396010940/AnsiballZ_ini_file.py'
Dec 05 08:58:50 compute-1 sudo[62109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:50 compute-1 python3.9[62111]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:58:50 compute-1 sudo[62109]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:50 compute-1 sudo[62261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yadompnvlvtilpfsldhmomikhhfgrvso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925130.3895373-273-180535411359932/AnsiballZ_ini_file.py'
Dec 05 08:58:50 compute-1 sudo[62261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:50 compute-1 python3.9[62263]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:58:50 compute-1 sudo[62261]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:51 compute-1 sudo[62413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lajmvkxcfreztoowvgtssamdrxaijoma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925131.0844104-273-178743947791569/AnsiballZ_ini_file.py'
Dec 05 08:58:51 compute-1 sudo[62413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:51 compute-1 python3.9[62415]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:58:51 compute-1 sudo[62413]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:52 compute-1 sudo[62565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifeahgqkxsnvsczcgloltviwmgeevbaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925132.0011907-366-197468688882252/AnsiballZ_dnf.py'
Dec 05 08:58:52 compute-1 sudo[62565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:52 compute-1 python3.9[62567]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:58:54 compute-1 sudo[62565]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:54 compute-1 sudo[62718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sywwetjprmhcnzrqbgzoqkknnecypajf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925134.635822-399-260834018724741/AnsiballZ_setup.py'
Dec 05 08:58:54 compute-1 sudo[62718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:55 compute-1 python3.9[62720]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:58:55 compute-1 sudo[62718]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:55 compute-1 sudo[62872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkgsdrkshwohbzmzsaxabflmgnghfzuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925135.493212-423-162131604208618/AnsiballZ_stat.py'
Dec 05 08:58:55 compute-1 sudo[62872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:55 compute-1 python3.9[62874]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 08:58:55 compute-1 sudo[62872]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:56 compute-1 sudo[63024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aimhzwlvhpmwtncvgraijshulflsalxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925136.2776887-450-152632176651723/AnsiballZ_stat.py'
Dec 05 08:58:56 compute-1 sudo[63024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:56 compute-1 python3.9[63026]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 08:58:56 compute-1 sudo[63024]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:57 compute-1 sudo[63176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwxkwzecfgeqarvctfwwtxghjspwukro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925137.168487-480-22357388375559/AnsiballZ_command.py'
Dec 05 08:58:57 compute-1 sudo[63176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:57 compute-1 python3.9[63178]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:58:57 compute-1 sudo[63176]: pam_unix(sudo:session): session closed for user root
Dec 05 08:58:58 compute-1 sudo[63329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxnhzjzmkowxcdzhmwrfjvkdsiunmufm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925138.0580344-510-195346872793005/AnsiballZ_service_facts.py'
Dec 05 08:58:58 compute-1 sudo[63329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:58:58 compute-1 python3.9[63331]: ansible-service_facts Invoked
Dec 05 08:58:58 compute-1 network[63348]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 08:58:58 compute-1 network[63349]: 'network-scripts' will be removed from distribution in near future.
Dec 05 08:58:58 compute-1 network[63350]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 08:59:01 compute-1 sudo[63329]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:03 compute-1 sshd-session[63485]: Received disconnect from 122.168.194.41 port 43846:11: Bye Bye [preauth]
Dec 05 08:59:03 compute-1 sshd-session[63485]: Disconnected from authenticating user root 122.168.194.41 port 43846 [preauth]
Dec 05 08:59:03 compute-1 sudo[63635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjqdbcwqrigkbrirgtvsazdpfrgqebnx ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764925143.4453442-555-106041631354524/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764925143.4453442-555-106041631354524/args'
Dec 05 08:59:03 compute-1 sudo[63635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:03 compute-1 sudo[63635]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:04 compute-1 sudo[63802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amauosfwsleluwizhitumbrhgjcixcax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925144.3265886-588-74141607686938/AnsiballZ_dnf.py'
Dec 05 08:59:04 compute-1 sudo[63802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:05 compute-1 python3.9[63804]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 08:59:06 compute-1 sudo[63802]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:08 compute-1 sudo[63955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbhlsdbcnkjgydxcxndeqhxdqsaufsex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925147.4387863-627-201162478015712/AnsiballZ_package_facts.py'
Dec 05 08:59:08 compute-1 sudo[63955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:08 compute-1 python3.9[63957]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 05 08:59:08 compute-1 sudo[63955]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:10 compute-1 sudo[64107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilwkykmujgishajezglgplsncgdraeuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925149.6892636-657-78350998538219/AnsiballZ_stat.py'
Dec 05 08:59:10 compute-1 sudo[64107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:10 compute-1 python3.9[64109]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:10 compute-1 sudo[64107]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:10 compute-1 sudo[64232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqaazzndnqbiphijjuxozcdpvjlyeyuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925149.6892636-657-78350998538219/AnsiballZ_copy.py'
Dec 05 08:59:10 compute-1 sudo[64232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:11 compute-1 python3.9[64234]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925149.6892636-657-78350998538219/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:11 compute-1 sudo[64232]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:11 compute-1 sudo[64386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwjcfkmxyymsxorkuzbwbkmykqwovgcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925151.3340302-702-26889329707605/AnsiballZ_stat.py'
Dec 05 08:59:11 compute-1 sudo[64386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:11 compute-1 python3.9[64388]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:11 compute-1 sudo[64386]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:12 compute-1 sudo[64511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evitsopajsoxomczjsdimcmzzvmltfyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925151.3340302-702-26889329707605/AnsiballZ_copy.py'
Dec 05 08:59:12 compute-1 sudo[64511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:12 compute-1 python3.9[64513]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925151.3340302-702-26889329707605/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:12 compute-1 sudo[64511]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:14 compute-1 sudo[64667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjnlytrpwefrzfyynbpumjprgppdzxmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925153.639938-766-264213473871097/AnsiballZ_lineinfile.py'
Dec 05 08:59:14 compute-1 sudo[64667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:14 compute-1 python3.9[64669]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:14 compute-1 sudo[64667]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:14 compute-1 sshd-session[64516]: Received disconnect from 101.47.162.91 port 42404:11: Bye Bye [preauth]
Dec 05 08:59:14 compute-1 sshd-session[64516]: Disconnected from authenticating user root 101.47.162.91 port 42404 [preauth]
Dec 05 08:59:15 compute-1 sudo[64821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szleylueoerbfcaaulfspspzttslslcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925155.525139-810-213811085172034/AnsiballZ_setup.py'
Dec 05 08:59:15 compute-1 sudo[64821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:16 compute-1 python3.9[64823]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:59:16 compute-1 sudo[64821]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:16 compute-1 sudo[64905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajmnytohfasdpvjinzdznfjhvsvxpsai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925155.525139-810-213811085172034/AnsiballZ_systemd.py'
Dec 05 08:59:16 compute-1 sudo[64905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:17 compute-1 python3.9[64907]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:59:17 compute-1 sudo[64905]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:18 compute-1 sudo[65059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avwuvtzellyqdrbqxgegzezyfusvodpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925158.3985465-859-128706660036549/AnsiballZ_setup.py'
Dec 05 08:59:18 compute-1 sudo[65059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:18 compute-1 python3.9[65061]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 08:59:19 compute-1 sudo[65059]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:19 compute-1 sudo[65143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceadejeynyywbwekijnbgjshpwdhttlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925158.3985465-859-128706660036549/AnsiballZ_systemd.py'
Dec 05 08:59:19 compute-1 sudo[65143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:19 compute-1 python3.9[65145]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 08:59:19 compute-1 chronyd[781]: chronyd exiting
Dec 05 08:59:19 compute-1 systemd[1]: Stopping NTP client/server...
Dec 05 08:59:19 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Dec 05 08:59:19 compute-1 systemd[1]: Stopped NTP client/server.
Dec 05 08:59:19 compute-1 systemd[1]: Starting NTP client/server...
Dec 05 08:59:19 compute-1 chronyd[65155]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 05 08:59:19 compute-1 chronyd[65155]: Frequency -23.773 +/- 0.404 ppm read from /var/lib/chrony/drift
Dec 05 08:59:19 compute-1 chronyd[65155]: Loaded seccomp filter (level 2)
Dec 05 08:59:19 compute-1 systemd[1]: Started NTP client/server.
Dec 05 08:59:19 compute-1 sudo[65143]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:21 compute-1 sshd-session[60294]: Connection closed by 192.168.122.30 port 42902
Dec 05 08:59:21 compute-1 sshd-session[60291]: pam_unix(sshd:session): session closed for user zuul
Dec 05 08:59:21 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Dec 05 08:59:21 compute-1 systemd[1]: session-15.scope: Consumed 27.000s CPU time.
Dec 05 08:59:21 compute-1 systemd-logind[807]: Session 15 logged out. Waiting for processes to exit.
Dec 05 08:59:21 compute-1 systemd-logind[807]: Removed session 15.
Dec 05 08:59:26 compute-1 sshd-session[65181]: Accepted publickey for zuul from 192.168.122.30 port 53164 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 08:59:26 compute-1 systemd-logind[807]: New session 16 of user zuul.
Dec 05 08:59:26 compute-1 systemd[1]: Started Session 16 of User zuul.
Dec 05 08:59:26 compute-1 sshd-session[65181]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 08:59:28 compute-1 python3.9[65334]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 08:59:28 compute-1 sudo[65488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnpjjjicgditldezbsmpucdieugljwtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925168.514208-60-261817181548412/AnsiballZ_file.py'
Dec 05 08:59:28 compute-1 sudo[65488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:29 compute-1 python3.9[65490]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:29 compute-1 sudo[65488]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:29 compute-1 sudo[65663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrsyqjqmpllucqagcldzlpyimptutjuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925169.4156299-84-152743034894517/AnsiballZ_stat.py'
Dec 05 08:59:29 compute-1 sudo[65663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:30 compute-1 python3.9[65665]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:30 compute-1 sudo[65663]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:30 compute-1 sudo[65741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilwbqecipqqzddkcbuatkhoftzanjxsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925169.4156299-84-152743034894517/AnsiballZ_file.py'
Dec 05 08:59:30 compute-1 sudo[65741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:30 compute-1 python3.9[65743]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.d01_jxo2 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:30 compute-1 sudo[65741]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:31 compute-1 sudo[65893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbpnuorjqsfczyusdobnmwztunbryswt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925171.3012428-144-185925643633146/AnsiballZ_stat.py'
Dec 05 08:59:31 compute-1 sudo[65893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:31 compute-1 python3.9[65895]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:31 compute-1 sudo[65893]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:32 compute-1 sudo[66016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsajeictlrkqvmsuinhxgbtfthicsxjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925171.3012428-144-185925643633146/AnsiballZ_copy.py'
Dec 05 08:59:32 compute-1 sudo[66016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:32 compute-1 python3.9[66018]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925171.3012428-144-185925643633146/.source _original_basename=.xkvgruz3 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:32 compute-1 sudo[66016]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:33 compute-1 sudo[66168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwcxrsolykcnhiecxvrasreupfwcyfxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925172.8690276-192-17200445728854/AnsiballZ_file.py'
Dec 05 08:59:33 compute-1 sudo[66168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:33 compute-1 python3.9[66170]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:59:33 compute-1 sudo[66168]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:33 compute-1 sudo[66320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgzfdtoxvogehfalqnapyrnfgiikhgib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925173.607077-216-228899706011636/AnsiballZ_stat.py'
Dec 05 08:59:33 compute-1 sudo[66320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:34 compute-1 python3.9[66322]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:34 compute-1 sudo[66320]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:34 compute-1 sudo[66443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klesclrpdvuzbvzlkrpcizdfkviclbqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925173.607077-216-228899706011636/AnsiballZ_copy.py'
Dec 05 08:59:34 compute-1 sudo[66443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:34 compute-1 python3.9[66445]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925173.607077-216-228899706011636/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:59:34 compute-1 sudo[66443]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:35 compute-1 sudo[66597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyleqvflujonjthdfpthgwotnrpamyov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925174.875007-216-214216079232370/AnsiballZ_stat.py'
Dec 05 08:59:35 compute-1 sudo[66597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:35 compute-1 python3.9[66599]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:35 compute-1 sudo[66597]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:35 compute-1 sudo[66720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzhjzvrhrcqkufrbqeleucjverhxereh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925174.875007-216-214216079232370/AnsiballZ_copy.py'
Dec 05 08:59:35 compute-1 sudo[66720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:35 compute-1 python3.9[66722]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925174.875007-216-214216079232370/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 08:59:35 compute-1 sshd-session[66446]: Received disconnect from 43.225.158.169 port 38191:11: Bye Bye [preauth]
Dec 05 08:59:35 compute-1 sshd-session[66446]: Disconnected from authenticating user root 43.225.158.169 port 38191 [preauth]
Dec 05 08:59:35 compute-1 sudo[66720]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:36 compute-1 sudo[66872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjlxniotmdhyrshnyiubhkaoqznkrodq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925176.2299056-303-54158665641413/AnsiballZ_file.py'
Dec 05 08:59:36 compute-1 sudo[66872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:36 compute-1 python3.9[66874]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:36 compute-1 sudo[66872]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:37 compute-1 sudo[67024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uibuvvyxvqqnjidaqqqfkaqdcsavnruw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925176.9109993-327-174774461183349/AnsiballZ_stat.py'
Dec 05 08:59:37 compute-1 sudo[67024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:37 compute-1 python3.9[67026]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:37 compute-1 sudo[67024]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:37 compute-1 sudo[67147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wievtpogtncbouylkypthyxzfihvaden ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925176.9109993-327-174774461183349/AnsiballZ_copy.py'
Dec 05 08:59:37 compute-1 sudo[67147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:38 compute-1 python3.9[67149]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925176.9109993-327-174774461183349/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:38 compute-1 sudo[67147]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:38 compute-1 sudo[67299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apxntfnwhgaiqigjlepepvmiblmquvco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925178.2991445-372-280327466423127/AnsiballZ_stat.py'
Dec 05 08:59:38 compute-1 sudo[67299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:38 compute-1 python3.9[67301]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:38 compute-1 sudo[67299]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:39 compute-1 sudo[67422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynguzwqbkhqokhoayeyuaabbdmnowvab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925178.2991445-372-280327466423127/AnsiballZ_copy.py'
Dec 05 08:59:39 compute-1 sudo[67422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:39 compute-1 python3.9[67424]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925178.2991445-372-280327466423127/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:39 compute-1 sudo[67422]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:40 compute-1 sudo[67574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vztiroavylnenjhwscsnvtxrtzudshbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925179.700334-417-270059702291982/AnsiballZ_systemd.py'
Dec 05 08:59:40 compute-1 sudo[67574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:40 compute-1 python3.9[67576]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:59:40 compute-1 systemd[1]: Reloading.
Dec 05 08:59:40 compute-1 systemd-rc-local-generator[67603]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:59:40 compute-1 systemd-sysv-generator[67607]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:59:40 compute-1 systemd[1]: Reloading.
Dec 05 08:59:41 compute-1 systemd-rc-local-generator[67644]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:59:41 compute-1 systemd-sysv-generator[67647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:59:41 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Dec 05 08:59:41 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Dec 05 08:59:41 compute-1 sudo[67574]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:41 compute-1 sudo[67801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biymsfvafnzuijhilbnrdhalwrkxbahb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925181.441975-441-201136969396467/AnsiballZ_stat.py'
Dec 05 08:59:41 compute-1 sudo[67801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:41 compute-1 python3.9[67803]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:41 compute-1 sudo[67801]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:42 compute-1 sudo[67924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odxdbkzmrrgesbogoiwalxngusklpcbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925181.441975-441-201136969396467/AnsiballZ_copy.py'
Dec 05 08:59:42 compute-1 sudo[67924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:42 compute-1 python3.9[67926]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925181.441975-441-201136969396467/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:42 compute-1 sudo[67924]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:42 compute-1 sudo[68078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psloqctklqkynqbogjjxpdbnpryxrnon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925182.6873734-486-230567833421898/AnsiballZ_stat.py'
Dec 05 08:59:42 compute-1 sudo[68078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:43 compute-1 python3.9[68080]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:43 compute-1 sudo[68078]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:43 compute-1 sudo[68201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcpsroikcmbnnlakjevlzatgcaroxhgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925182.6873734-486-230567833421898/AnsiballZ_copy.py'
Dec 05 08:59:43 compute-1 sudo[68201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:43 compute-1 sshd-session[67930]: Received disconnect from 185.118.15.236 port 33814:11: Bye Bye [preauth]
Dec 05 08:59:43 compute-1 sshd-session[67930]: Disconnected from authenticating user root 185.118.15.236 port 33814 [preauth]
Dec 05 08:59:43 compute-1 python3.9[68203]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925182.6873734-486-230567833421898/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:43 compute-1 sudo[68201]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:44 compute-1 sudo[68353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scabhcfquuwozvvdcdopuwmlvvzkcmgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925184.024989-531-41809248413759/AnsiballZ_systemd.py'
Dec 05 08:59:44 compute-1 sudo[68353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:44 compute-1 python3.9[68355]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:59:44 compute-1 systemd[1]: Reloading.
Dec 05 08:59:44 compute-1 systemd-sysv-generator[68389]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:59:44 compute-1 systemd-rc-local-generator[68385]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:59:44 compute-1 systemd[1]: Reloading.
Dec 05 08:59:44 compute-1 systemd-rc-local-generator[68420]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:59:44 compute-1 systemd-sysv-generator[68424]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:59:45 compute-1 systemd[1]: Starting Create netns directory...
Dec 05 08:59:45 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 08:59:45 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 08:59:45 compute-1 systemd[1]: Finished Create netns directory.
Dec 05 08:59:45 compute-1 sudo[68353]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:46 compute-1 python3.9[68583]: ansible-ansible.builtin.service_facts Invoked
Dec 05 08:59:46 compute-1 network[68600]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 08:59:46 compute-1 network[68601]: 'network-scripts' will be removed from distribution in near future.
Dec 05 08:59:46 compute-1 network[68602]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 08:59:49 compute-1 sudo[68862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcealbofucpycrozxeigfspefkqdgdos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925189.1717243-579-94612471282119/AnsiballZ_systemd.py'
Dec 05 08:59:49 compute-1 sudo[68862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:49 compute-1 python3.9[68864]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:59:49 compute-1 systemd[1]: Reloading.
Dec 05 08:59:49 compute-1 systemd-rc-local-generator[68893]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:59:49 compute-1 systemd-sysv-generator[68896]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:59:50 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 05 08:59:50 compute-1 iptables.init[68903]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 05 08:59:50 compute-1 iptables.init[68903]: iptables: Flushing firewall rules: [  OK  ]
Dec 05 08:59:50 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Dec 05 08:59:50 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 05 08:59:50 compute-1 sudo[68862]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:50 compute-1 sudo[69097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnpsvlshgwxyyabsxhoxrvouwfewdqur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925190.6683474-579-251961292375293/AnsiballZ_systemd.py'
Dec 05 08:59:50 compute-1 sudo[69097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:51 compute-1 python3.9[69099]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:59:51 compute-1 sudo[69097]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:51 compute-1 sudo[69251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-globlimpsyilbmwzmeltbfufgkglznlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925191.6644785-627-278472040342623/AnsiballZ_systemd.py'
Dec 05 08:59:51 compute-1 sudo[69251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:52 compute-1 python3.9[69253]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 08:59:52 compute-1 systemd[1]: Reloading.
Dec 05 08:59:52 compute-1 systemd-rc-local-generator[69282]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 08:59:52 compute-1 systemd-sysv-generator[69285]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 08:59:52 compute-1 systemd[1]: Starting Netfilter Tables...
Dec 05 08:59:52 compute-1 systemd[1]: Finished Netfilter Tables.
Dec 05 08:59:52 compute-1 sudo[69251]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:53 compute-1 sudo[69442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbybuvtzghegfyohqmthjemgcuzkjslf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925192.8742425-651-158319752792551/AnsiballZ_command.py'
Dec 05 08:59:53 compute-1 sudo[69442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:53 compute-1 python3.9[69444]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 08:59:53 compute-1 sudo[69442]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:54 compute-1 sudo[69595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdufzpkybvvebqqlbworexqkqkybvwgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925194.660878-693-75547307113614/AnsiballZ_stat.py'
Dec 05 08:59:54 compute-1 sudo[69595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:55 compute-1 python3.9[69597]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:55 compute-1 sudo[69595]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:55 compute-1 sudo[69720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nerncgdeiggxojiqbiqemnecckvjclur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925194.660878-693-75547307113614/AnsiballZ_copy.py'
Dec 05 08:59:55 compute-1 sudo[69720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:55 compute-1 python3.9[69722]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925194.660878-693-75547307113614/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:55 compute-1 sudo[69720]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:56 compute-1 sudo[69873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnezqondxvdasicgjzeroyrccscuptzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925196.1069117-738-267387494831146/AnsiballZ_systemd.py'
Dec 05 08:59:56 compute-1 sudo[69873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:56 compute-1 python3.9[69875]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 08:59:56 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Dec 05 08:59:56 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Dec 05 08:59:56 compute-1 sshd[1008]: Received SIGHUP; restarting.
Dec 05 08:59:56 compute-1 sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 05 08:59:56 compute-1 sshd[1008]: Server listening on :: port 22.
Dec 05 08:59:56 compute-1 sudo[69873]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:57 compute-1 sudo[70029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bagqcdffyqecgcejwgyddjpcupnvopzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925197.4053051-762-150936316526279/AnsiballZ_file.py'
Dec 05 08:59:57 compute-1 sudo[70029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:57 compute-1 python3.9[70031]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:57 compute-1 sudo[70029]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:58 compute-1 sudo[70181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhktkzmglbrztsbangrdabnxlezosyit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925198.1816616-786-113476386843985/AnsiballZ_stat.py'
Dec 05 08:59:58 compute-1 sudo[70181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:58 compute-1 python3.9[70183]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 08:59:58 compute-1 sudo[70181]: pam_unix(sudo:session): session closed for user root
Dec 05 08:59:59 compute-1 sudo[70304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfkjrubctuyuloujacfmijmfkkwftvqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925198.1816616-786-113476386843985/AnsiballZ_copy.py'
Dec 05 08:59:59 compute-1 sudo[70304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 08:59:59 compute-1 python3.9[70306]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925198.1816616-786-113476386843985/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 08:59:59 compute-1 sudo[70304]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:00 compute-1 sudo[70456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njenpnqppgxqhxihlosqegiqqxnfeqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925199.9610546-840-271611980529763/AnsiballZ_timezone.py'
Dec 05 09:00:00 compute-1 sudo[70456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:00 compute-1 python3.9[70458]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 05 09:00:00 compute-1 systemd[1]: Starting Time & Date Service...
Dec 05 09:00:00 compute-1 systemd[1]: Started Time & Date Service.
Dec 05 09:00:00 compute-1 sudo[70456]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:01 compute-1 sudo[70612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwamephesztfdaoqcyfwsjmyxhsnqwtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925201.3058324-867-5029321369719/AnsiballZ_file.py'
Dec 05 09:00:01 compute-1 sudo[70612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:01 compute-1 python3.9[70614]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:01 compute-1 sudo[70612]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:02 compute-1 sudo[70764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioqlrkouitrsqfcdtzmcjrbxarihfqhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925202.0262358-891-37731409003425/AnsiballZ_stat.py'
Dec 05 09:00:02 compute-1 sudo[70764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:02 compute-1 python3.9[70766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:00:02 compute-1 sudo[70764]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:02 compute-1 sudo[70887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diceariidqamkbkyqwctqnvlghdqeyct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925202.0262358-891-37731409003425/AnsiballZ_copy.py'
Dec 05 09:00:02 compute-1 sudo[70887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:03 compute-1 python3.9[70889]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925202.0262358-891-37731409003425/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:03 compute-1 sudo[70887]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:03 compute-1 sudo[71039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uehhqaqcvhmjevgdqqyxxkvtyagxsxei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925203.3388224-936-122337836341954/AnsiballZ_stat.py'
Dec 05 09:00:03 compute-1 sudo[71039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:03 compute-1 python3.9[71041]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:00:03 compute-1 sudo[71039]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:04 compute-1 sudo[71162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sebtgnofvoumcuaexnlprfyvmbjepeud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925203.3388224-936-122337836341954/AnsiballZ_copy.py'
Dec 05 09:00:04 compute-1 sudo[71162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:04 compute-1 python3.9[71164]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925203.3388224-936-122337836341954/.source.yaml _original_basename=.qe4mlmw3 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:04 compute-1 sudo[71162]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:05 compute-1 sudo[71314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spwlcqojoerdtezjtpuqueahklmziyec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925205.046463-981-27623778125683/AnsiballZ_stat.py'
Dec 05 09:00:05 compute-1 sudo[71314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:05 compute-1 python3.9[71316]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:00:05 compute-1 sudo[71314]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:05 compute-1 sudo[71437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqdhhzunjlgxvriwtujhofttssrylskr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925205.046463-981-27623778125683/AnsiballZ_copy.py'
Dec 05 09:00:05 compute-1 sudo[71437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:06 compute-1 python3.9[71439]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925205.046463-981-27623778125683/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:06 compute-1 sudo[71437]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:06 compute-1 sudo[71589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehrremcvxbckjfckeepdhmdqcgezfovc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925206.3555856-1026-61404821149452/AnsiballZ_command.py'
Dec 05 09:00:06 compute-1 sudo[71589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:06 compute-1 python3.9[71591]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:00:06 compute-1 sudo[71589]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:07 compute-1 sudo[71742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzfkssiycbxqafbodhodezzyoweychbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925207.0992496-1050-266981234629085/AnsiballZ_command.py'
Dec 05 09:00:07 compute-1 sudo[71742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:07 compute-1 python3.9[71744]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:00:07 compute-1 sudo[71742]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:08 compute-1 sudo[71895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hczlocrymngtacodmpfmvhawrcyycjbt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925207.867764-1074-36584683747678/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 09:00:08 compute-1 sudo[71895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:08 compute-1 python3[71897]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 09:00:08 compute-1 sudo[71895]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:09 compute-1 sudo[72047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gucrxgsokqqnlnbfnkypsjysdvofhklq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925208.8763433-1098-194108265183673/AnsiballZ_stat.py'
Dec 05 09:00:09 compute-1 sudo[72047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:09 compute-1 python3.9[72049]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:00:09 compute-1 sudo[72047]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:09 compute-1 sudo[72170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtqnbvsrkxrsimgducpansexvaggsyqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925208.8763433-1098-194108265183673/AnsiballZ_copy.py'
Dec 05 09:00:09 compute-1 sudo[72170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:09 compute-1 python3.9[72172]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925208.8763433-1098-194108265183673/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:10 compute-1 sudo[72170]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:10 compute-1 sudo[72322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhkbznabnxlnzomohngquhidwnzhrcuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925210.2629719-1143-239381799391879/AnsiballZ_stat.py'
Dec 05 09:00:10 compute-1 sudo[72322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:11 compute-1 python3.9[72324]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:00:11 compute-1 sudo[72322]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:11 compute-1 sudo[72445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syhdkgcjsgxjpqdftftkbwntktrchtno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925210.2629719-1143-239381799391879/AnsiballZ_copy.py'
Dec 05 09:00:11 compute-1 sudo[72445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:11 compute-1 python3.9[72447]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925210.2629719-1143-239381799391879/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:11 compute-1 sudo[72445]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:12 compute-1 sudo[72597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tadnlaymjxyneaaunkoyvypnjdqtmdex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925212.0186656-1188-262754557835618/AnsiballZ_stat.py'
Dec 05 09:00:12 compute-1 sudo[72597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:12 compute-1 python3.9[72599]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:00:12 compute-1 sudo[72597]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:12 compute-1 sudo[72720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ertojsgqbnbubdszewdrgqurjhvtilzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925212.0186656-1188-262754557835618/AnsiballZ_copy.py'
Dec 05 09:00:12 compute-1 sudo[72720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:13 compute-1 python3.9[72722]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925212.0186656-1188-262754557835618/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:13 compute-1 sudo[72720]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:13 compute-1 sudo[72872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhvisqgjtncvwmfpldbnjocoypfpuywx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925213.3853667-1233-232129462613435/AnsiballZ_stat.py'
Dec 05 09:00:13 compute-1 sudo[72872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:13 compute-1 python3.9[72874]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:00:13 compute-1 sudo[72872]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:14 compute-1 sudo[72995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nawqkpurbfkmlgniscsicmqtllqpsbtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925213.3853667-1233-232129462613435/AnsiballZ_copy.py'
Dec 05 09:00:14 compute-1 sudo[72995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:14 compute-1 python3.9[72997]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925213.3853667-1233-232129462613435/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:14 compute-1 sudo[72995]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:15 compute-1 sudo[73149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llqvuqefzmsbnjeqauotgusglyinpmof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925214.76379-1279-249229510814653/AnsiballZ_stat.py'
Dec 05 09:00:15 compute-1 sudo[73149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:15 compute-1 python3.9[73151]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:00:15 compute-1 sudo[73149]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:16 compute-1 sudo[73272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efdyjxlxbgkawflunmioubupzrbgefnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925214.76379-1279-249229510814653/AnsiballZ_copy.py'
Dec 05 09:00:16 compute-1 sudo[73272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:16 compute-1 python3.9[73274]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925214.76379-1279-249229510814653/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:16 compute-1 sudo[73272]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:16 compute-1 sshd-session[73074]: Received disconnect from 122.168.194.41 port 37774:11: Bye Bye [preauth]
Dec 05 09:00:16 compute-1 sshd-session[73074]: Disconnected from authenticating user root 122.168.194.41 port 37774 [preauth]
Dec 05 09:00:16 compute-1 sudo[73424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hurrhbrkrotdpszeiorxcqoxpfxnnqrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925216.50477-1323-88288288089788/AnsiballZ_file.py'
Dec 05 09:00:16 compute-1 sudo[73424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:16 compute-1 python3.9[73426]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:17 compute-1 sudo[73424]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:17 compute-1 sudo[73576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgxrynqaldktlcemdozytvkepavyfbaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925217.233019-1347-95473194948999/AnsiballZ_command.py'
Dec 05 09:00:17 compute-1 sudo[73576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:17 compute-1 python3.9[73578]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:00:17 compute-1 sudo[73576]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:18 compute-1 sudo[73735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfxfhjrlnrnxwughbwdqmwpodgzykgef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925218.068367-1371-224972324876872/AnsiballZ_blockinfile.py'
Dec 05 09:00:18 compute-1 sudo[73735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:18 compute-1 python3.9[73737]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:18 compute-1 sudo[73735]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:20 compute-1 sudo[73888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojgsrxyvqjkckhqcfwxhchkunttcppyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925219.9391272-1398-236755298647818/AnsiballZ_file.py'
Dec 05 09:00:20 compute-1 sudo[73888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:20 compute-1 python3.9[73890]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:20 compute-1 sudo[73888]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:20 compute-1 sudo[74040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hopmmwvaalhjwootcoqxyavdcllbvriv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925220.6671991-1398-195505840524371/AnsiballZ_file.py'
Dec 05 09:00:20 compute-1 sudo[74040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:21 compute-1 python3.9[74042]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:21 compute-1 sudo[74040]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:21 compute-1 sudo[74192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zneytsijeiauxzubuhdejhkxapflggig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925221.442513-1443-100804579512593/AnsiballZ_mount.py'
Dec 05 09:00:21 compute-1 sudo[74192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:22 compute-1 python3.9[74194]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 05 09:00:22 compute-1 sudo[74192]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:22 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:00:22 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:00:22 compute-1 sudo[74346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdpjaomkbjqddedawdavxcyeygqoazdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925222.336405-1443-279654807394089/AnsiballZ_mount.py'
Dec 05 09:00:22 compute-1 sudo[74346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:22 compute-1 python3.9[74348]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 05 09:00:22 compute-1 sudo[74346]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:23 compute-1 sshd-session[65184]: Connection closed by 192.168.122.30 port 53164
Dec 05 09:00:23 compute-1 sshd-session[65181]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:00:23 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Dec 05 09:00:23 compute-1 systemd[1]: session-16.scope: Consumed 36.307s CPU time.
Dec 05 09:00:23 compute-1 systemd-logind[807]: Session 16 logged out. Waiting for processes to exit.
Dec 05 09:00:23 compute-1 systemd-logind[807]: Removed session 16.
Dec 05 09:00:28 compute-1 sshd-session[74374]: Accepted publickey for zuul from 192.168.122.30 port 45694 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:00:28 compute-1 systemd-logind[807]: New session 17 of user zuul.
Dec 05 09:00:28 compute-1 systemd[1]: Started Session 17 of User zuul.
Dec 05 09:00:28 compute-1 sshd-session[74374]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:00:29 compute-1 sudo[74527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldynbymddljijoosvxohmyzislkdhlbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925228.6009893-19-111644133416635/AnsiballZ_tempfile.py'
Dec 05 09:00:29 compute-1 sudo[74527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:29 compute-1 python3.9[74529]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 05 09:00:29 compute-1 sudo[74527]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:29 compute-1 sudo[74679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tveasywqqylszfdsijxcbkiystckpylz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925229.475536-55-30292004507056/AnsiballZ_stat.py'
Dec 05 09:00:29 compute-1 sudo[74679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:30 compute-1 python3.9[74681]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:00:30 compute-1 sudo[74679]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:30 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 05 09:00:31 compute-1 sudo[74833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmmmmcxxxthlmkflfcoxvslgpfbgmspz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925230.4854999-85-273638705819309/AnsiballZ_setup.py'
Dec 05 09:00:31 compute-1 sudo[74833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:31 compute-1 python3.9[74835]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:00:31 compute-1 sudo[74833]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:32 compute-1 sudo[74985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uknsewvxvngsunzpfeosvykxkqyswkpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925231.7418296-110-111213031859359/AnsiballZ_blockinfile.py'
Dec 05 09:00:32 compute-1 sudo[74985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:32 compute-1 python3.9[74987]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCw9iXJtfPYCz4Fx82XY+AKLliRYbAofVbg/zsy92I2eUirGieu757oIyvYqCKnaR/4bCsHMiA0KzhMaeWE9Np8QnRG/bgr+5dXggFnsiLc91soKEGKmtoCRNkcUwVlQQYKK+A+JcpThWkgHyKzSNZjX3LNxaGfIQ+bWz/ev3NeKtmnz+7lsRmPfcAjFFwDDXUyWlOom5KPvOOQ40hZjR7B3UaFYvsltslecYLEOq/SUXFp7lxDuuzmvFAkUfLX6LmZWK1hPb5TakWHidK/KomO5wRk3zoV/yfIpxjqHEFbX3tgM5YJ4CfYn5YTo+AGZTuBj/CUQaY5n7RSqnxb03GFT6HZjuq/ArqoJGpRShx9tq3RS1nEgLe0flhYHpK1RzbOP6aVC2Hznrdeye3SIqYp1X0UyaNsC4F3LhpDXMjkcVUHppvvs3VsfzOqpFNIs3cRWGdqHrFWDVliEe5xBXcjr/6obqeLCeOw9LPnTqF/+MhNDu2cTsStKQRa6qZZY30=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBOMzpo662NIEWaOCWccvSFekuC4Q7FVkv2ynICgYIBK
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHE0N2FojYmeSpK2iCxZfIl3m4X/T4sw+bRbcxlwakCC3DI4aAi6r89kG39waqtdjbi3W6KDnbPdSyz5GQu/DtM=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCJ39L0LUKmzSmk+SKVNiXmdeppHC9vuRMK6kfLLbe+e3J7fheyktIWH4m1r0h4Z4YKkOJOb/e3KAXEFS14z2Savh2J/7kZO+fjMoXGRVwGCwvT2llDdb456B7BRCXw6lz3q+81reUPewRJoX8RxO8pvAqMqDBMB8C44Q69CcYt88/MOSJhh5OLu7nsUEOTEGsqbv3qn3B83oRJ4ciWlEK+3qWuWykV3FyA+l4HqW2aDgiNCyzML/ddaXqVoWZH7cfyFT/Wx+vQ4SzWssc5092GOJWF6Qc8F8amZIlS9O0D1l/OW7yO3aL/efMW68fdT+6UBGaRAA/7UktWq2bmfHRO+XnTFy8vMHsPIRygOU+IuClJElh+MaWs1iigfvbbSF75l8lRfADzV+VS4vZNR7lOviHzJcWauQtX2oFwhG554H9k3uqv8mqCMVfZ2eoFP252J3KC9ENwYwjP64Sbd1dPFHW6zRWrJBxaBKoN6nKrhwqYx6BhiNEt5YyfazciDtk=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqe6xBXUoEeZsqqUrHJJFPh57hyaFh1ZGgyNHWUxEIa
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHHbwOOxMcOIKqHXpxAgCX8iPNLcOGZLVy/o6h+hM75Q2xOTEPMql3uaaTLf8vboLiPSYYFge9h8ufpAGr8FY3A=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvf/WE8Mw9cxikNSFiwgKbNpUel03zQZo4gKoiINbwPx1087mT3UZIxIXdtO8BX039dnp5c3YzmlA77SKdZFNA11lgoJlMfBodPbstRqUkfyrD9XsQFKK+eFlLW2dHtcok91T837EeXdXpPpbkfWHrikE2vpBXdYXraWZagbAPOc4P5cmep9/aOUsnjcn8pQtprB5e1NnF4VgXipLWHvrGgWLfUdvuASdXdUJj9jmxB5ImRcByIPnQWY5NOBd3gwLM/ljeFOWMDrg1YwT3ZrZWLzkt9ZS+fxYfgCxDil3qNV+GP/GW3+lfiyTl1NqTGUXyjYAn2SKGDYKStO+RdtZWDaWbaFMhr+7rkGgkDEwND8JorsW5ahkuijr7A/scTd5sSViXCILQ2pHwzs2PndHgo+y0hT0MAShgjtN104f6zxTJhOUgZssfx2czJM00qHFc158RRVO54fjDstoHvi11quyiuLalUciefLiZ9nMZ4nM6Tbi6vLjN3K/zpjSSRp0=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKQBWkHK/XfxUnRuK2xunkiGosjgRKiYPsI+eJJub1OD
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM6mn1z6Pkr4n7li0MzE9BAI6YFqtzCLterdPs6vB70rPM1fsVl4jiugeuxZe41sivQPeBzDb+v28jfZDuQYJAs=
                                             create=True mode=0644 path=/tmp/ansible.fesrq6ud state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:32 compute-1 sudo[74985]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:33 compute-1 sudo[75137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkizelphsbnplyaehjsnvsnvowbjxnyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925232.875691-134-257283362360565/AnsiballZ_command.py'
Dec 05 09:00:33 compute-1 sudo[75137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:33 compute-1 python3.9[75139]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.fesrq6ud' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:00:33 compute-1 sudo[75137]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:34 compute-1 sudo[75291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgepzxgiywbxwmnwenhjdwhimynjxexn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925233.7793472-158-130332232325875/AnsiballZ_file.py'
Dec 05 09:00:34 compute-1 sudo[75291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:34 compute-1 python3.9[75293]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.fesrq6ud state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:34 compute-1 sudo[75291]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:34 compute-1 sshd-session[74377]: Connection closed by 192.168.122.30 port 45694
Dec 05 09:00:34 compute-1 sshd-session[74374]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:00:34 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Dec 05 09:00:34 compute-1 systemd[1]: session-17.scope: Consumed 3.757s CPU time.
Dec 05 09:00:34 compute-1 systemd-logind[807]: Session 17 logged out. Waiting for processes to exit.
Dec 05 09:00:34 compute-1 systemd-logind[807]: Removed session 17.
Dec 05 09:00:41 compute-1 sshd-session[75318]: Accepted publickey for zuul from 192.168.122.30 port 45322 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:00:41 compute-1 systemd-logind[807]: New session 18 of user zuul.
Dec 05 09:00:41 compute-1 systemd[1]: Started Session 18 of User zuul.
Dec 05 09:00:41 compute-1 sshd-session[75318]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:00:42 compute-1 python3.9[75471]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:00:43 compute-1 sudo[75627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntuoqxqraudsqafuqknhvfwyxoahcfwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925242.6813805-57-126381580438457/AnsiballZ_systemd.py'
Dec 05 09:00:43 compute-1 sudo[75627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:43 compute-1 python3.9[75629]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 05 09:00:43 compute-1 sudo[75627]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:44 compute-1 sshd-session[75542]: Received disconnect from 43.225.158.169 port 51335:11: Bye Bye [preauth]
Dec 05 09:00:44 compute-1 sshd-session[75542]: Disconnected from authenticating user root 43.225.158.169 port 51335 [preauth]
Dec 05 09:00:44 compute-1 sudo[75781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycpoviitjjuoroysopzhcwdofcaxlphv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925243.9290743-81-179013435833928/AnsiballZ_systemd.py'
Dec 05 09:00:44 compute-1 sudo[75781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:44 compute-1 python3.9[75783]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:00:44 compute-1 sudo[75781]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:45 compute-1 sudo[75934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcfbvtkdhwogrnwhmqjgzbzavgabqrfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925244.8388696-108-128222202268023/AnsiballZ_command.py'
Dec 05 09:00:45 compute-1 sudo[75934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:45 compute-1 python3.9[75936]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:00:45 compute-1 sudo[75934]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:46 compute-1 sudo[76087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suqegjxzmborcdzeovrcdqmtiwhbsjci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925245.7446055-132-9008734279750/AnsiballZ_stat.py'
Dec 05 09:00:46 compute-1 sudo[76087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:46 compute-1 python3.9[76089]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:00:46 compute-1 sudo[76087]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:47 compute-1 sudo[76241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dixgoowndupixblnpnkupehwjfdsdzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925246.856858-156-131028574534590/AnsiballZ_command.py'
Dec 05 09:00:47 compute-1 sudo[76241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:47 compute-1 python3.9[76243]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:00:47 compute-1 sudo[76241]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:48 compute-1 sudo[76396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiwrpmabzkibznvrmjlulmlaloggjabl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925247.683315-180-188382759685061/AnsiballZ_file.py'
Dec 05 09:00:48 compute-1 sudo[76396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:48 compute-1 python3.9[76398]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:00:48 compute-1 sudo[76396]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:48 compute-1 sshd-session[75321]: Connection closed by 192.168.122.30 port 45322
Dec 05 09:00:48 compute-1 sshd-session[75318]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:00:48 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Dec 05 09:00:48 compute-1 systemd[1]: session-18.scope: Consumed 4.585s CPU time.
Dec 05 09:00:48 compute-1 systemd-logind[807]: Session 18 logged out. Waiting for processes to exit.
Dec 05 09:00:48 compute-1 systemd-logind[807]: Removed session 18.
Dec 05 09:00:53 compute-1 sshd-session[76423]: Accepted publickey for zuul from 192.168.122.30 port 55790 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:00:53 compute-1 systemd-logind[807]: New session 19 of user zuul.
Dec 05 09:00:53 compute-1 systemd[1]: Started Session 19 of User zuul.
Dec 05 09:00:53 compute-1 sshd-session[76423]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:00:54 compute-1 python3.9[76576]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:00:55 compute-1 sudo[76730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaverpnzounziheparnuzzbpkxlxlwpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925255.399699-63-130704044059261/AnsiballZ_setup.py'
Dec 05 09:00:55 compute-1 sudo[76730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:56 compute-1 python3.9[76732]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 09:00:56 compute-1 sudo[76730]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:56 compute-1 sudo[76814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxsknibuhjypsgfhuvxjjcnekaaxoogh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925255.399699-63-130704044059261/AnsiballZ_dnf.py'
Dec 05 09:00:56 compute-1 sudo[76814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:00:56 compute-1 python3.9[76816]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 09:00:58 compute-1 sudo[76814]: pam_unix(sudo:session): session closed for user root
Dec 05 09:00:59 compute-1 python3.9[76969]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:01:00 compute-1 sshd-session[76907]: Received disconnect from 185.118.15.236 port 33936:11: Bye Bye [preauth]
Dec 05 09:01:00 compute-1 sshd-session[76907]: Disconnected from authenticating user root 185.118.15.236 port 33936 [preauth]
Dec 05 09:01:00 compute-1 python3.9[77120]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 09:01:01 compute-1 CROND[77198]: (root) CMD (run-parts /etc/cron.hourly)
Dec 05 09:01:01 compute-1 run-parts[77201]: (/etc/cron.hourly) starting 0anacron
Dec 05 09:01:01 compute-1 anacron[77209]: Anacron started on 2025-12-05
Dec 05 09:01:01 compute-1 anacron[77209]: Will run job `cron.daily' in 11 min.
Dec 05 09:01:01 compute-1 anacron[77209]: Will run job `cron.weekly' in 31 min.
Dec 05 09:01:01 compute-1 anacron[77209]: Will run job `cron.monthly' in 51 min.
Dec 05 09:01:01 compute-1 anacron[77209]: Jobs will be executed sequentially
Dec 05 09:01:01 compute-1 run-parts[77212]: (/etc/cron.hourly) finished 0anacron
Dec 05 09:01:01 compute-1 CROND[77197]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 05 09:01:01 compute-1 python3.9[77285]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:01:02 compute-1 python3.9[77435]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:01:03 compute-1 sshd-session[76426]: Connection closed by 192.168.122.30 port 55790
Dec 05 09:01:03 compute-1 sshd-session[76423]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:01:03 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Dec 05 09:01:03 compute-1 systemd[1]: session-19.scope: Consumed 6.413s CPU time.
Dec 05 09:01:03 compute-1 systemd-logind[807]: Session 19 logged out. Waiting for processes to exit.
Dec 05 09:01:03 compute-1 systemd-logind[807]: Removed session 19.
Dec 05 09:01:09 compute-1 sshd-session[77460]: Accepted publickey for zuul from 192.168.122.30 port 50092 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:01:09 compute-1 systemd-logind[807]: New session 20 of user zuul.
Dec 05 09:01:09 compute-1 systemd[1]: Started Session 20 of User zuul.
Dec 05 09:01:09 compute-1 sshd-session[77460]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:01:10 compute-1 python3.9[77613]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:01:12 compute-1 sudo[77767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqbrjalbovxospnqpispqlwdgobujwld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925271.5691376-111-143300708548607/AnsiballZ_file.py'
Dec 05 09:01:12 compute-1 sudo[77767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:12 compute-1 python3.9[77769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:12 compute-1 sudo[77767]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:12 compute-1 sudo[77919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bczcploahffoxrqiqiylsbpirpuuokdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925272.4384036-111-178894866912760/AnsiballZ_file.py'
Dec 05 09:01:12 compute-1 sudo[77919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:12 compute-1 python3.9[77921]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:12 compute-1 sudo[77919]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:13 compute-1 sudo[78071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxiifrksntyhdlaelijmtstdggaavccs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925273.162776-161-277746740424528/AnsiballZ_stat.py'
Dec 05 09:01:13 compute-1 sudo[78071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:13 compute-1 python3.9[78073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:13 compute-1 sudo[78071]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:14 compute-1 sudo[78194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obzwfebgrenhxhjuzarhocnuqyktcpwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925273.162776-161-277746740424528/AnsiballZ_copy.py'
Dec 05 09:01:14 compute-1 sudo[78194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:14 compute-1 python3.9[78196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925273.162776-161-277746740424528/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c603aa8b55fb0c0b5c979c68cb9412884a6110c5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:14 compute-1 sudo[78194]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:14 compute-1 sudo[78346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmwsasgspngakykmdxxupcvvskwciwso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925274.4260705-161-224055745373980/AnsiballZ_stat.py'
Dec 05 09:01:14 compute-1 sudo[78346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:14 compute-1 python3.9[78348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:14 compute-1 sudo[78346]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:15 compute-1 sudo[78469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csxfszsnkhccoljxwffnjeyqlozdogxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925274.4260705-161-224055745373980/AnsiballZ_copy.py'
Dec 05 09:01:15 compute-1 sudo[78469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:15 compute-1 python3.9[78471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925274.4260705-161-224055745373980/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=11a36bbe21ef590cbdf3e8ad4ac0ae84daa17216 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:15 compute-1 sudo[78469]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:15 compute-1 sudo[78621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcmovlvmjqdjcqevchodqxibnnncacig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925275.6493561-161-85156515091188/AnsiballZ_stat.py'
Dec 05 09:01:15 compute-1 sudo[78621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:16 compute-1 python3.9[78623]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:16 compute-1 sudo[78621]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:16 compute-1 sudo[78744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbumxtzxfbyyljbhyczzwjmampqkbsbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925275.6493561-161-85156515091188/AnsiballZ_copy.py'
Dec 05 09:01:16 compute-1 sudo[78744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:16 compute-1 python3.9[78746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925275.6493561-161-85156515091188/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=77c1c0c20190dc970850bf5ae44c3243b3764d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:16 compute-1 sudo[78744]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:17 compute-1 sudo[78896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzvlyunxmefjzlcbrvqondewkyqzpbue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925276.9178724-291-233372128679088/AnsiballZ_file.py'
Dec 05 09:01:17 compute-1 sudo[78896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:17 compute-1 python3.9[78898]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:17 compute-1 sudo[78896]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:17 compute-1 sudo[79048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niasfknkfvaavrzlparsnmnlnozapsnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925277.5690935-291-70535604502997/AnsiballZ_file.py'
Dec 05 09:01:17 compute-1 sudo[79048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:18 compute-1 python3.9[79050]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:18 compute-1 sudo[79048]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:18 compute-1 sudo[79200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxojaizlqpxzopmrsqvoarnclaubxabl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925278.269679-338-10858030247707/AnsiballZ_stat.py'
Dec 05 09:01:18 compute-1 sudo[79200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:18 compute-1 python3.9[79202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:18 compute-1 sudo[79200]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:19 compute-1 sudo[79323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzqmmikqeesasneszmyazkrdkrgnulbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925278.269679-338-10858030247707/AnsiballZ_copy.py'
Dec 05 09:01:19 compute-1 sudo[79323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:19 compute-1 python3.9[79325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925278.269679-338-10858030247707/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=6c3a0c25837795eae45ce867ee9c1b30c884e66d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:19 compute-1 sudo[79323]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:19 compute-1 sudo[79475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whyhbfaduzivrjxpqffoozygipetqlqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925279.4739714-338-104580183936385/AnsiballZ_stat.py'
Dec 05 09:01:19 compute-1 sudo[79475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:19 compute-1 python3.9[79477]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:19 compute-1 sudo[79475]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:20 compute-1 sudo[79598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syedqqzcpmdvbmoyvteweawqvipiuvqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925279.4739714-338-104580183936385/AnsiballZ_copy.py'
Dec 05 09:01:20 compute-1 sudo[79598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:20 compute-1 python3.9[79600]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925279.4739714-338-104580183936385/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=9c77b8e17a8dd6fdd56037c2672848636335a9a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:20 compute-1 sudo[79598]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:20 compute-1 sudo[79750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmtuqyhrpirpyzrinfajmuarelbkffkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925280.6399448-338-69049178535153/AnsiballZ_stat.py'
Dec 05 09:01:20 compute-1 sudo[79750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:21 compute-1 python3.9[79752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:21 compute-1 sudo[79750]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:21 compute-1 sudo[79873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dstuqwswdxsvgkyyrtdlgkvwaujqmlsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925280.6399448-338-69049178535153/AnsiballZ_copy.py'
Dec 05 09:01:21 compute-1 sudo[79873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:21 compute-1 python3.9[79875]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925280.6399448-338-69049178535153/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=fa88b65fe15339e8c59645e5189fcf582cc0f3ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:21 compute-1 sudo[79873]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:22 compute-1 sudo[80025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkzurbhteohchvpusgulcxanhfuxjclc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925281.9467587-473-27993206943597/AnsiballZ_file.py'
Dec 05 09:01:22 compute-1 sudo[80025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:22 compute-1 python3.9[80027]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:22 compute-1 sudo[80025]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:22 compute-1 sudo[80177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxnlgwgoqkwhmvbglcycwxvqgmqmbbgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925282.6230648-473-253655230594892/AnsiballZ_file.py'
Dec 05 09:01:22 compute-1 sudo[80177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:23 compute-1 python3.9[80179]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:23 compute-1 sudo[80177]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:23 compute-1 sudo[80329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucnmbfedpkqtioybebaygkhkrtunquac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925283.3533819-519-17911902792666/AnsiballZ_stat.py'
Dec 05 09:01:23 compute-1 sudo[80329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:23 compute-1 python3.9[80331]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:23 compute-1 sudo[80329]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:24 compute-1 sudo[80452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldbebvmbcnvikobgkdneynnrrrmvkjgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925283.3533819-519-17911902792666/AnsiballZ_copy.py'
Dec 05 09:01:24 compute-1 sudo[80452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:24 compute-1 python3.9[80454]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925283.3533819-519-17911902792666/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=80791d6ff0d1afe4e9dc69702b447a03796ba2f9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:24 compute-1 sudo[80452]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:24 compute-1 sudo[80604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzuxqzgdopcderefbnjmnkjhjmokrfsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925284.5373297-519-156978712658573/AnsiballZ_stat.py'
Dec 05 09:01:24 compute-1 sudo[80604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:25 compute-1 python3.9[80606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:25 compute-1 sudo[80604]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:25 compute-1 sudo[80727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duernzrgtwbipqygbgvtimlbjdgixdta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925284.5373297-519-156978712658573/AnsiballZ_copy.py'
Dec 05 09:01:25 compute-1 sudo[80727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:25 compute-1 python3.9[80729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925284.5373297-519-156978712658573/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=5128ccb9f343660fdc6aa3fee9d74f6859898ffe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:25 compute-1 sudo[80727]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:26 compute-1 sudo[80879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqggyjyklzdekfwxntawhdyouekpknip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925285.7507997-519-35862842743537/AnsiballZ_stat.py'
Dec 05 09:01:26 compute-1 sudo[80879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:26 compute-1 python3.9[80881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:26 compute-1 sudo[80879]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:26 compute-1 sudo[81004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxuygcshlkjrjanfggersownywaoyqiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925285.7507997-519-35862842743537/AnsiballZ_copy.py'
Dec 05 09:01:26 compute-1 sudo[81004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:26 compute-1 python3.9[81006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925285.7507997-519-35862842743537/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=6529c6c7f805c99bc99f26c6462c3cdb97b2eace backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:26 compute-1 sudo[81004]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:27 compute-1 sudo[81156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jibeolwvbzmsjecuqhfoingajvaugotz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925287.00253-654-81971444874816/AnsiballZ_file.py'
Dec 05 09:01:27 compute-1 sudo[81156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:27 compute-1 python3.9[81158]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:27 compute-1 sudo[81156]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:27 compute-1 sudo[81308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaltdltbitgtxgisgkcuohmqwegbvukx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925287.641107-654-163357700260453/AnsiballZ_file.py'
Dec 05 09:01:27 compute-1 sudo[81308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:27 compute-1 sshd-session[80929]: Received disconnect from 122.168.194.41 port 48594:11: Bye Bye [preauth]
Dec 05 09:01:27 compute-1 sshd-session[80929]: Disconnected from authenticating user root 122.168.194.41 port 48594 [preauth]
Dec 05 09:01:28 compute-1 python3.9[81310]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:28 compute-1 sudo[81308]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:28 compute-1 sudo[81460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olnfhdspultpediqrfzxjwlnzlfmbwzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925288.294649-698-237621398354629/AnsiballZ_stat.py'
Dec 05 09:01:28 compute-1 sudo[81460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:28 compute-1 python3.9[81462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:28 compute-1 sudo[81460]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:28 compute-1 chronyd[65155]: Selected source 51.79.69.205 (pool.ntp.org)
Dec 05 09:01:29 compute-1 sudo[81583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgttaymdercghfouwrfdclezmfoazrfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925288.294649-698-237621398354629/AnsiballZ_copy.py'
Dec 05 09:01:29 compute-1 sudo[81583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:29 compute-1 python3.9[81585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925288.294649-698-237621398354629/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c05584703c6b0a4e10f7c78e4594b0c28fdcba5e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:29 compute-1 sudo[81583]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:29 compute-1 sudo[81735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdiuctvyswjjelpmphjjnmnesgfvxsks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925289.517328-698-227737740453042/AnsiballZ_stat.py'
Dec 05 09:01:29 compute-1 sudo[81735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:30 compute-1 python3.9[81737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:30 compute-1 sudo[81735]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:30 compute-1 sudo[81858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvmdwavnvmiynsqsdvtxzjwgmvcblgkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925289.517328-698-227737740453042/AnsiballZ_copy.py'
Dec 05 09:01:30 compute-1 sudo[81858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:30 compute-1 python3.9[81860]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925289.517328-698-227737740453042/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=5128ccb9f343660fdc6aa3fee9d74f6859898ffe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:30 compute-1 sudo[81858]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:31 compute-1 sudo[82010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcfifwoyrzsyqyujytaqsdalkoowztxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925290.7602756-698-214866644867201/AnsiballZ_stat.py'
Dec 05 09:01:31 compute-1 sudo[82010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:31 compute-1 python3.9[82012]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:31 compute-1 sudo[82010]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:31 compute-1 sudo[82133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcrjvljxvjxrldfhjegttxnpmoeywbpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925290.7602756-698-214866644867201/AnsiballZ_copy.py'
Dec 05 09:01:31 compute-1 sudo[82133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:31 compute-1 python3.9[82135]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925290.7602756-698-214866644867201/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=4fe1efd98acb6356f6c187a9329a9545c0b01998 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:31 compute-1 sudo[82133]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:32 compute-1 sudo[82285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhkwqivsuhhrsibhbckceqfjaaglbcud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925292.6984608-880-14408567658578/AnsiballZ_file.py'
Dec 05 09:01:32 compute-1 sudo[82285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:33 compute-1 python3.9[82287]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:33 compute-1 sudo[82285]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:33 compute-1 sudo[82437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdusylhuvtgmhztaogoejborqyquhcna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925293.3787575-906-59391718454548/AnsiballZ_stat.py'
Dec 05 09:01:33 compute-1 sudo[82437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:34 compute-1 python3.9[82439]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:34 compute-1 sudo[82437]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:34 compute-1 sudo[82560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byubevmpckxeapplvvliikibhqcejewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925293.3787575-906-59391718454548/AnsiballZ_copy.py'
Dec 05 09:01:34 compute-1 sudo[82560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:34 compute-1 python3.9[82562]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925293.3787575-906-59391718454548/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0908ee08d371593f319166fe14009677e91e42dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:34 compute-1 sudo[82560]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:35 compute-1 sudo[82712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tilfvidbxyzessrwpkrczyonvessqfgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925295.1207035-966-112142226158244/AnsiballZ_file.py'
Dec 05 09:01:35 compute-1 sudo[82712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:35 compute-1 python3.9[82714]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:35 compute-1 sudo[82712]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:36 compute-1 sudo[82864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sldlnasfybzbkaghoaviivcilfpuplqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925295.8019505-988-140897275806009/AnsiballZ_stat.py'
Dec 05 09:01:36 compute-1 sudo[82864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:36 compute-1 python3.9[82866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:36 compute-1 sudo[82864]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:36 compute-1 sudo[82987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgvidjiexcpfsvqfigdenqnvcmxdkfbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925295.8019505-988-140897275806009/AnsiballZ_copy.py'
Dec 05 09:01:36 compute-1 sudo[82987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:36 compute-1 python3.9[82989]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925295.8019505-988-140897275806009/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0908ee08d371593f319166fe14009677e91e42dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:36 compute-1 sudo[82987]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:37 compute-1 sudo[83139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tutpfeacaxiqbcjtdknyduptwumgcear ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925297.0819318-1034-14457147795040/AnsiballZ_file.py'
Dec 05 09:01:37 compute-1 sudo[83139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:37 compute-1 python3.9[83141]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:37 compute-1 sudo[83139]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:38 compute-1 sudo[83293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhcdmeutdhopfhjzdhijbnwhjkvefiex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925297.764071-1061-263835048545441/AnsiballZ_stat.py'
Dec 05 09:01:38 compute-1 sudo[83293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:38 compute-1 python3.9[83295]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:38 compute-1 sudo[83293]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:38 compute-1 sudo[83416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cifjjxkqwslisznemkivhubqnhwzbekb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925297.764071-1061-263835048545441/AnsiballZ_copy.py'
Dec 05 09:01:38 compute-1 sudo[83416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:38 compute-1 python3.9[83418]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925297.764071-1061-263835048545441/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0908ee08d371593f319166fe14009677e91e42dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:38 compute-1 sudo[83416]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:39 compute-1 sudo[83568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frqddkaytryzyzoyzmroosamfyuzypwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925299.1270745-1110-120503447979312/AnsiballZ_file.py'
Dec 05 09:01:39 compute-1 sudo[83568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:39 compute-1 python3.9[83570]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:39 compute-1 sudo[83568]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:40 compute-1 sudo[83720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsyezhmvtverxeinpolichchbloxskxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925299.8467028-1134-123043219067876/AnsiballZ_stat.py'
Dec 05 09:01:40 compute-1 sudo[83720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:40 compute-1 python3.9[83722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:40 compute-1 sudo[83720]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:40 compute-1 sudo[83843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rknckxqovnyrvhywxjattrxdrxpozpqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925299.8467028-1134-123043219067876/AnsiballZ_copy.py'
Dec 05 09:01:40 compute-1 sudo[83843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:40 compute-1 python3.9[83845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925299.8467028-1134-123043219067876/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0908ee08d371593f319166fe14009677e91e42dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:41 compute-1 sudo[83843]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:41 compute-1 sshd-session[83142]: Received disconnect from 101.47.162.91 port 46450:11: Bye Bye [preauth]
Dec 05 09:01:41 compute-1 sshd-session[83142]: Disconnected from authenticating user root 101.47.162.91 port 46450 [preauth]
Dec 05 09:01:41 compute-1 sudo[83995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akacjwpayzsunkilwjrnqgfkwcpwhyvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925301.2135012-1184-78672070766655/AnsiballZ_file.py'
Dec 05 09:01:41 compute-1 sudo[83995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:41 compute-1 python3.9[83997]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:41 compute-1 sudo[83995]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:42 compute-1 sudo[84147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iibmkodhhtlozneyfcoqnqcrcddzldxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925301.9541106-1208-69384566885618/AnsiballZ_stat.py'
Dec 05 09:01:42 compute-1 sudo[84147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:42 compute-1 python3.9[84149]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:42 compute-1 sudo[84147]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:42 compute-1 sudo[84270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfhhqepgjupgbxesxkmpxivzdwmiwbcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925301.9541106-1208-69384566885618/AnsiballZ_copy.py'
Dec 05 09:01:42 compute-1 sudo[84270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:43 compute-1 python3.9[84272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925301.9541106-1208-69384566885618/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0908ee08d371593f319166fe14009677e91e42dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:43 compute-1 sudo[84270]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:43 compute-1 sudo[84422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjvfprihqszqdzqehtglemkuxzdbwmbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925303.3319476-1261-208050312594746/AnsiballZ_file.py'
Dec 05 09:01:43 compute-1 sudo[84422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:43 compute-1 python3.9[84424]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:43 compute-1 sudo[84422]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:43 compute-1 irqbalance[793]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 05 09:01:43 compute-1 irqbalance[793]: IRQ 26 affinity is now unmanaged
Dec 05 09:01:44 compute-1 sudo[84574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkybguguwqlukcqurfmkgjwfmsmmibke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925304.0001762-1287-275861637287941/AnsiballZ_stat.py'
Dec 05 09:01:44 compute-1 sudo[84574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:44 compute-1 python3.9[84576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:44 compute-1 sudo[84574]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:44 compute-1 sudo[84697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqhfaohydqzzzgroemsgjeostbogwhys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925304.0001762-1287-275861637287941/AnsiballZ_copy.py'
Dec 05 09:01:44 compute-1 sudo[84697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:45 compute-1 python3.9[84699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925304.0001762-1287-275861637287941/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0908ee08d371593f319166fe14009677e91e42dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:45 compute-1 sudo[84697]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:45 compute-1 sudo[84849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwbzvecgfbinkhavplzryycbhyrnhpgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925305.2407162-1333-59092927576293/AnsiballZ_file.py'
Dec 05 09:01:45 compute-1 sudo[84849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:45 compute-1 python3.9[84851]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:45 compute-1 sudo[84849]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:46 compute-1 sudo[85001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uinjxokaknslgmgkmvjvgkgsdeqgrsyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925305.915076-1352-141320780527117/AnsiballZ_stat.py'
Dec 05 09:01:46 compute-1 sudo[85001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:46 compute-1 python3.9[85003]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:01:46 compute-1 sudo[85001]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:46 compute-1 sudo[85124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohosdzzddtkueonjmdpugbjjiqgwhbze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925305.915076-1352-141320780527117/AnsiballZ_copy.py'
Dec 05 09:01:46 compute-1 sudo[85124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:46 compute-1 python3.9[85126]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925305.915076-1352-141320780527117/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0908ee08d371593f319166fe14009677e91e42dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:01:47 compute-1 sudo[85124]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:47 compute-1 sshd-session[77463]: Connection closed by 192.168.122.30 port 50092
Dec 05 09:01:47 compute-1 sshd-session[77460]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:01:47 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Dec 05 09:01:47 compute-1 systemd[1]: session-20.scope: Consumed 29.587s CPU time.
Dec 05 09:01:47 compute-1 systemd-logind[807]: Session 20 logged out. Waiting for processes to exit.
Dec 05 09:01:47 compute-1 systemd-logind[807]: Removed session 20.
Dec 05 09:01:52 compute-1 sshd-session[85151]: Received disconnect from 43.225.158.169 port 36242:11: Bye Bye [preauth]
Dec 05 09:01:52 compute-1 sshd-session[85151]: Disconnected from authenticating user root 43.225.158.169 port 36242 [preauth]
Dec 05 09:01:53 compute-1 sshd-session[85153]: Accepted publickey for zuul from 192.168.122.30 port 54584 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:01:53 compute-1 systemd-logind[807]: New session 21 of user zuul.
Dec 05 09:01:53 compute-1 systemd[1]: Started Session 21 of User zuul.
Dec 05 09:01:53 compute-1 sshd-session[85153]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:01:54 compute-1 python3.9[85306]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:01:55 compute-1 sudo[85460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-porkiacftsytgpmvlhfaqojwhnbrkdya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925314.9850469-63-29558197703888/AnsiballZ_file.py'
Dec 05 09:01:55 compute-1 sudo[85460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:55 compute-1 python3.9[85462]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:55 compute-1 sudo[85460]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:56 compute-1 sudo[85612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozrnkrmboqkzaixfioeuflpjjxcppjvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925315.7951643-63-150211007887666/AnsiballZ_file.py'
Dec 05 09:01:56 compute-1 sudo[85612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:56 compute-1 python3.9[85614]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:01:56 compute-1 sudo[85612]: pam_unix(sudo:session): session closed for user root
Dec 05 09:01:57 compute-1 python3.9[85764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:01:57 compute-1 sudo[85914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dstlxunmryxoxdkyqzdmpribsmmxdplq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925317.307462-132-66120363382885/AnsiballZ_seboolean.py'
Dec 05 09:01:57 compute-1 sudo[85914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:01:58 compute-1 python3.9[85916]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 05 09:01:59 compute-1 sudo[85914]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:00 compute-1 sudo[86070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozbcbzggujkkwxfpkytrametukhecleu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925319.7961965-162-278886092909854/AnsiballZ_setup.py'
Dec 05 09:02:00 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 05 09:02:00 compute-1 sudo[86070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:00 compute-1 python3.9[86072]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 09:02:00 compute-1 sudo[86070]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:01 compute-1 sudo[86154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mngjjqqlxtlpwnkiixdbvesdrtedovzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925319.7961965-162-278886092909854/AnsiballZ_dnf.py'
Dec 05 09:02:01 compute-1 sudo[86154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:01 compute-1 python3.9[86156]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 09:02:02 compute-1 sudo[86154]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:03 compute-1 sudo[86307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqujktlvlfgxjupiarcphvcjxgkkaqgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925323.0093796-198-5250773524054/AnsiballZ_systemd.py'
Dec 05 09:02:03 compute-1 sudo[86307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:04 compute-1 python3.9[86309]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 09:02:04 compute-1 sudo[86307]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:04 compute-1 sudo[86462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muvtearvkbnixejawzrssceytlkxwgyo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925324.36301-222-229578442048469/AnsiballZ_edpm_nftables_snippet.py'
Dec 05 09:02:04 compute-1 sudo[86462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:05 compute-1 python3[86464]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 05 09:02:05 compute-1 sudo[86462]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:05 compute-1 sudo[86614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhrnxsikruqxvynqgbdkcuhekudvnogg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925325.350565-249-93630599719032/AnsiballZ_file.py'
Dec 05 09:02:05 compute-1 sudo[86614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:06 compute-1 python3.9[86616]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:06 compute-1 sudo[86614]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:06 compute-1 sudo[86766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aginxjvapdaolgyinzllcphqbclmftya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925326.2901335-273-175147425585447/AnsiballZ_stat.py'
Dec 05 09:02:06 compute-1 sudo[86766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:06 compute-1 python3.9[86768]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:06 compute-1 sudo[86766]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:07 compute-1 sudo[86844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfupjrfcjvjnqfaensnisltauevjjjul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925326.2901335-273-175147425585447/AnsiballZ_file.py'
Dec 05 09:02:07 compute-1 sudo[86844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:07 compute-1 python3.9[86846]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:07 compute-1 sudo[86844]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:07 compute-1 sudo[86996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvvvcomznermoputjifndlqcouhvqccn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925327.5650487-309-255341431603327/AnsiballZ_stat.py'
Dec 05 09:02:07 compute-1 sudo[86996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:08 compute-1 python3.9[86998]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:08 compute-1 sudo[86996]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:08 compute-1 sudo[87074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znydchdytyyenaimhorokopwmlqqrbgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925327.5650487-309-255341431603327/AnsiballZ_file.py'
Dec 05 09:02:08 compute-1 sudo[87074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:08 compute-1 python3.9[87076]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.y1_xrqk1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:08 compute-1 sudo[87074]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:09 compute-1 sudo[87226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoiouxqpabgzgrnsusbekswwdjilukhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925328.8139262-345-262563427003666/AnsiballZ_stat.py'
Dec 05 09:02:09 compute-1 sudo[87226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:09 compute-1 python3.9[87228]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:09 compute-1 sudo[87226]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:09 compute-1 sudo[87304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qozlwxrnqqoaqppofxgkoleowtfaabei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925328.8139262-345-262563427003666/AnsiballZ_file.py'
Dec 05 09:02:09 compute-1 sudo[87304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:09 compute-1 python3.9[87306]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:09 compute-1 sudo[87304]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:12 compute-1 sudo[87456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gweurvxkpzegscxayvagfexjrjniuhmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925331.6320746-384-161551227810677/AnsiballZ_command.py'
Dec 05 09:02:12 compute-1 sudo[87456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:12 compute-1 python3.9[87458]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:02:12 compute-1 sudo[87456]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:12 compute-1 sudo[87609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxdxrcahsjplenzgjxdjvgsxisdaapim ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925332.5121443-408-14268473704272/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 09:02:12 compute-1 sudo[87609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:13 compute-1 python3[87611]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 09:02:13 compute-1 sudo[87609]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:13 compute-1 sudo[87761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qztaoyocnypnpzyqyskcvxyvvnayqrve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925333.3944361-432-192927775498832/AnsiballZ_stat.py'
Dec 05 09:02:13 compute-1 sudo[87761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:13 compute-1 python3.9[87763]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:13 compute-1 sudo[87761]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:14 compute-1 sudo[87886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnedtssrugxzpbxzaipsivmsrghxebwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925333.3944361-432-192927775498832/AnsiballZ_copy.py'
Dec 05 09:02:14 compute-1 sudo[87886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:14 compute-1 python3.9[87888]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925333.3944361-432-192927775498832/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:14 compute-1 sudo[87886]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:15 compute-1 sudo[88038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paydwudvluohkojjfpeekncslubbxgzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925334.8825312-477-52972969023018/AnsiballZ_stat.py'
Dec 05 09:02:15 compute-1 sudo[88038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:15 compute-1 python3.9[88040]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:15 compute-1 sudo[88038]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:15 compute-1 sudo[88163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unakwqahtdyzagqfgnshuabsdnzwohmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925334.8825312-477-52972969023018/AnsiballZ_copy.py'
Dec 05 09:02:15 compute-1 sudo[88163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:16 compute-1 python3.9[88165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925334.8825312-477-52972969023018/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:16 compute-1 sudo[88163]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:16 compute-1 sudo[88315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oakmvrwmckksevwsiouifjpguiwdztxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925336.1980302-522-219491771692624/AnsiballZ_stat.py'
Dec 05 09:02:16 compute-1 sudo[88315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:16 compute-1 python3.9[88317]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:16 compute-1 sudo[88315]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:17 compute-1 sudo[88440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfshnadcicislaiuughvbjzdxwcpovrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925336.1980302-522-219491771692624/AnsiballZ_copy.py'
Dec 05 09:02:17 compute-1 sudo[88440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:17 compute-1 python3.9[88442]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925336.1980302-522-219491771692624/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:17 compute-1 sudo[88440]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:17 compute-1 sudo[88592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdlarpisizjriuorimfxpgarivmvoopz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925337.485894-567-176980606449808/AnsiballZ_stat.py'
Dec 05 09:02:17 compute-1 sudo[88592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:18 compute-1 python3.9[88594]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:18 compute-1 sudo[88592]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:18 compute-1 sudo[88717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axxjdombgwgkflyziodnxtnurgtspuvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925337.485894-567-176980606449808/AnsiballZ_copy.py'
Dec 05 09:02:18 compute-1 sudo[88717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:18 compute-1 python3.9[88719]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925337.485894-567-176980606449808/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:18 compute-1 sudo[88717]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:19 compute-1 sudo[88871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxtypollzaesbglwoofmapkvwureduch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925338.9615943-612-45424175507875/AnsiballZ_stat.py'
Dec 05 09:02:19 compute-1 sudo[88871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:19 compute-1 python3.9[88873]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:19 compute-1 sudo[88871]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:20 compute-1 sshd-session[88744]: Received disconnect from 185.118.15.236 port 34060:11: Bye Bye [preauth]
Dec 05 09:02:20 compute-1 sshd-session[88744]: Disconnected from authenticating user root 185.118.15.236 port 34060 [preauth]
Dec 05 09:02:20 compute-1 sudo[88996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyqoxtwgcmtcdatdnfnonzjflktvgzwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925338.9615943-612-45424175507875/AnsiballZ_copy.py'
Dec 05 09:02:20 compute-1 sudo[88996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:20 compute-1 python3.9[88998]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925338.9615943-612-45424175507875/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:20 compute-1 sudo[88996]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:20 compute-1 sudo[89148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irudislftureglvujvqopozuexjxfhms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925340.5234756-657-179484412357563/AnsiballZ_file.py'
Dec 05 09:02:20 compute-1 sudo[89148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:21 compute-1 python3.9[89150]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:21 compute-1 sudo[89148]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:21 compute-1 sudo[89300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kttqhjtzunuckgxkgrrtbdvjzozzxlup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925341.2381244-681-275310585121617/AnsiballZ_command.py'
Dec 05 09:02:21 compute-1 sudo[89300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:21 compute-1 python3.9[89302]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:02:21 compute-1 sudo[89300]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:22 compute-1 sudo[89455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhlqrqmkhmdhehlhmhgdfjrcqzropehv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925342.0241714-705-280645522732966/AnsiballZ_blockinfile.py'
Dec 05 09:02:22 compute-1 sudo[89455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:22 compute-1 python3.9[89457]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:22 compute-1 sudo[89455]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:23 compute-1 sudo[89607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbpoadjhbpkvmwhcytkyybikzwfrhkkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925343.0560143-733-66104880727166/AnsiballZ_command.py'
Dec 05 09:02:23 compute-1 sudo[89607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:23 compute-1 python3.9[89609]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:02:23 compute-1 sudo[89607]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:24 compute-1 sudo[89760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfhsdbsaacczktaygoycagtmkcqdltgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925343.8005002-756-183952291904778/AnsiballZ_stat.py'
Dec 05 09:02:24 compute-1 sudo[89760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:24 compute-1 python3.9[89762]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:02:24 compute-1 sudo[89760]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:25 compute-1 sudo[89914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldjmktihwiiyoqalvlfujhyuuqfatcol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925344.7806716-780-75894257234853/AnsiballZ_command.py'
Dec 05 09:02:25 compute-1 sudo[89914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:25 compute-1 python3.9[89916]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:02:25 compute-1 sudo[89914]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:25 compute-1 sudo[90069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpgfakpkcmfvinspzeednuqvvdawklhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925345.535433-804-90496908092721/AnsiballZ_file.py'
Dec 05 09:02:25 compute-1 sudo[90069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:26 compute-1 python3.9[90071]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:26 compute-1 sudo[90069]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:27 compute-1 python3.9[90221]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:02:28 compute-1 sudo[90372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkknpvjdjnglihnipygfkiplatyskqhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925348.522364-924-243052444531072/AnsiballZ_command.py'
Dec 05 09:02:28 compute-1 sudo[90372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:29 compute-1 python3.9[90374]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:02:29 compute-1 ovs-vsctl[90375]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 05 09:02:29 compute-1 sudo[90372]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:29 compute-1 sudo[90525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ickthrjthyyqvrhtzxufawoldayilgxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925349.7065506-951-109901832615558/AnsiballZ_command.py'
Dec 05 09:02:29 compute-1 sudo[90525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:30 compute-1 python3.9[90527]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:02:30 compute-1 sudo[90525]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:30 compute-1 sudo[90680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjvsadlzmkuxwmpdydgjxqsgirirbxwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925350.464539-975-95515072517380/AnsiballZ_command.py'
Dec 05 09:02:30 compute-1 sudo[90680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:30 compute-1 python3.9[90682]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:02:30 compute-1 ovs-vsctl[90683]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 05 09:02:31 compute-1 sudo[90680]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:31 compute-1 python3.9[90833]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:02:32 compute-1 sudo[90985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhgnuuwpaykkvlewztfxutcpcuqbnbsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925352.089189-1026-268182071174953/AnsiballZ_file.py'
Dec 05 09:02:32 compute-1 sudo[90985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:32 compute-1 python3.9[90987]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:02:32 compute-1 sudo[90985]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:33 compute-1 sudo[91137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulnttpcixrtzhwnugrzvmpnebwnuirji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925352.8737023-1050-241604269149471/AnsiballZ_stat.py'
Dec 05 09:02:33 compute-1 sudo[91137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:33 compute-1 python3.9[91139]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:33 compute-1 sudo[91137]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:33 compute-1 sudo[91215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrjlcnlvzjbvjdsoerwiyhcakyjluyze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925352.8737023-1050-241604269149471/AnsiballZ_file.py'
Dec 05 09:02:33 compute-1 sudo[91215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:33 compute-1 python3.9[91217]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:02:33 compute-1 sudo[91215]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:34 compute-1 sudo[91368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eompgnwarywbidcbdhzkzqbmgnvypclz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925353.970192-1050-102405835345123/AnsiballZ_stat.py'
Dec 05 09:02:34 compute-1 sudo[91368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:34 compute-1 python3.9[91370]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:34 compute-1 sudo[91368]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:35 compute-1 sudo[91446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrrnxerrowwohboqkjyvfuujjocerjif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925353.970192-1050-102405835345123/AnsiballZ_file.py'
Dec 05 09:02:35 compute-1 sudo[91446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:35 compute-1 python3.9[91448]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:02:35 compute-1 sudo[91446]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:35 compute-1 sudo[91598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdogdetwmnpyjvjodmnymnomzdfebyel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925355.5737574-1119-229423536601807/AnsiballZ_file.py'
Dec 05 09:02:35 compute-1 sudo[91598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:36 compute-1 python3.9[91600]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:36 compute-1 sudo[91598]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:36 compute-1 sudo[91750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leuppxmjiplnavvpwuixcnhogaedimwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925356.3065932-1143-86631501898908/AnsiballZ_stat.py'
Dec 05 09:02:36 compute-1 sudo[91750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:36 compute-1 python3.9[91752]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:36 compute-1 sudo[91750]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:37 compute-1 sudo[91828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msvgodtzdaoevytmzqaavkzvfohhdzxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925356.3065932-1143-86631501898908/AnsiballZ_file.py'
Dec 05 09:02:37 compute-1 sudo[91828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:37 compute-1 python3.9[91830]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:37 compute-1 sudo[91828]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:37 compute-1 sudo[91980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trvgdyhizjypnksakmeoyzicrmmdmvdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925357.5048327-1179-278075006939462/AnsiballZ_stat.py'
Dec 05 09:02:37 compute-1 sudo[91980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:37 compute-1 python3.9[91982]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:38 compute-1 sudo[91980]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:38 compute-1 sudo[92058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjctgknixublcxraddhspfqxfumukepo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925357.5048327-1179-278075006939462/AnsiballZ_file.py'
Dec 05 09:02:38 compute-1 sudo[92058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:38 compute-1 python3.9[92060]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:38 compute-1 sudo[92058]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:39 compute-1 sudo[92210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itiaaubfsxxkeuqzpwbskazlguknorha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925359.0437703-1215-95931642694569/AnsiballZ_systemd.py'
Dec 05 09:02:39 compute-1 sudo[92210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:39 compute-1 python3.9[92212]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:02:39 compute-1 systemd[1]: Reloading.
Dec 05 09:02:39 compute-1 systemd-rc-local-generator[92242]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:02:39 compute-1 systemd-sysv-generator[92245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:02:39 compute-1 sudo[92210]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:40 compute-1 sudo[92401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttrqfyjjhazkdawtzcpqdfxogotmyymy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925360.2071297-1239-15836045186634/AnsiballZ_stat.py'
Dec 05 09:02:40 compute-1 sudo[92401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:40 compute-1 python3.9[92403]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:40 compute-1 sudo[92401]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:41 compute-1 sudo[92479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkzyguadddazvajkqgqxpepfdswfkxyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925360.2071297-1239-15836045186634/AnsiballZ_file.py'
Dec 05 09:02:41 compute-1 sudo[92479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:41 compute-1 python3.9[92481]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:41 compute-1 sudo[92479]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:41 compute-1 sudo[92631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gywbnoyvreycjodstnsardkhlbyvzevb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925361.4869215-1275-165835666010377/AnsiballZ_stat.py'
Dec 05 09:02:41 compute-1 sudo[92631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:41 compute-1 python3.9[92633]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:42 compute-1 sudo[92631]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:42 compute-1 sudo[92709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-livwxncamykyzakwfpelxscpoivkeybi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925361.4869215-1275-165835666010377/AnsiballZ_file.py'
Dec 05 09:02:42 compute-1 sudo[92709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:42 compute-1 python3.9[92711]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:42 compute-1 sudo[92709]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:43 compute-1 sudo[92861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyhnbilyjfszaquyhcwpprrbpidevrtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925362.6741123-1311-102688125466571/AnsiballZ_systemd.py'
Dec 05 09:02:43 compute-1 sudo[92861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:43 compute-1 python3.9[92863]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:02:43 compute-1 systemd[1]: Reloading.
Dec 05 09:02:43 compute-1 systemd-rc-local-generator[92890]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:02:43 compute-1 systemd-sysv-generator[92895]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:02:43 compute-1 systemd[1]: Starting Create netns directory...
Dec 05 09:02:43 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 09:02:43 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 09:02:43 compute-1 systemd[1]: Finished Create netns directory.
Dec 05 09:02:43 compute-1 sudo[92861]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:44 compute-1 sudo[93056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilvvjnjmtyzjbrgjkaofeexycdicmtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925364.1097054-1341-68925083757847/AnsiballZ_file.py'
Dec 05 09:02:44 compute-1 sudo[93056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:44 compute-1 python3.9[93058]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:02:44 compute-1 sudo[93056]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:45 compute-1 sudo[93208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-horeomcwaecmdkopituqzkviznlqoyky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925364.8803396-1365-198976155386785/AnsiballZ_stat.py'
Dec 05 09:02:45 compute-1 sudo[93208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:45 compute-1 python3.9[93210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:45 compute-1 sudo[93208]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:45 compute-1 sudo[93333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cinymdkgdzrftkvfhdnvpflfcwjyeswb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925364.8803396-1365-198976155386785/AnsiballZ_copy.py'
Dec 05 09:02:45 compute-1 sudo[93333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:45 compute-1 python3.9[93335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925364.8803396-1365-198976155386785/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:02:45 compute-1 sudo[93333]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:46 compute-1 sshd-session[93211]: Received disconnect from 122.114.113.177 port 42214:11: Bye Bye [preauth]
Dec 05 09:02:46 compute-1 sshd-session[93211]: Disconnected from authenticating user root 122.114.113.177 port 42214 [preauth]
Dec 05 09:02:46 compute-1 sudo[93485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwhpxaaxtxojvpypqyhkrnyajyemmbrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925366.4845161-1416-51379378810758/AnsiballZ_file.py'
Dec 05 09:02:46 compute-1 sudo[93485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:46 compute-1 python3.9[93487]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:47 compute-1 sudo[93485]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:47 compute-1 sudo[93637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeflmzuvhwbupqfutcllcocwfudypuqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925367.2369313-1440-84253561870032/AnsiballZ_file.py'
Dec 05 09:02:47 compute-1 sudo[93637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:47 compute-1 python3.9[93639]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:02:47 compute-1 sudo[93637]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:48 compute-1 sudo[93789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeeuqjrzbanvzamwdjjfcknnitxeszyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925368.2222576-1464-151417378424753/AnsiballZ_stat.py'
Dec 05 09:02:48 compute-1 sudo[93789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:48 compute-1 python3.9[93791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:02:48 compute-1 sudo[93789]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:49 compute-1 sudo[93912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bscapkezaqbampcrddujbucmdrhraseo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925368.2222576-1464-151417378424753/AnsiballZ_copy.py'
Dec 05 09:02:49 compute-1 sudo[93912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:49 compute-1 python3.9[93914]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925368.2222576-1464-151417378424753/.source.json _original_basename=.wsb06tes follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:49 compute-1 sudo[93912]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:50 compute-1 python3.9[94064]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:52 compute-1 sudo[94487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfgsxihgkybeylgmlbwtolgemqatfork ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925371.787785-1584-83175728838750/AnsiballZ_container_config_data.py'
Dec 05 09:02:52 compute-1 sudo[94487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:52 compute-1 python3.9[94489]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 05 09:02:52 compute-1 sudo[94487]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:53 compute-1 sshd-session[94412]: Received disconnect from 122.168.194.41 port 59528:11: Bye Bye [preauth]
Dec 05 09:02:53 compute-1 sshd-session[94412]: Disconnected from authenticating user root 122.168.194.41 port 59528 [preauth]
Dec 05 09:02:53 compute-1 sudo[94639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmqvabffxkyardvedmcojulcyonsxrzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925373.1340501-1617-176874605855796/AnsiballZ_container_config_hash.py'
Dec 05 09:02:53 compute-1 sudo[94639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:53 compute-1 python3.9[94641]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:02:53 compute-1 sudo[94639]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:54 compute-1 sudo[94791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wemtytxsubtnitjbkhtpowrzszaoksut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925374.1037085-1644-139098667690390/AnsiballZ_podman_container_info.py'
Dec 05 09:02:54 compute-1 sudo[94791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:55 compute-1 python3.9[94793]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 09:02:55 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 09:02:55 compute-1 sudo[94791]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:57 compute-1 sudo[94954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfbfntnikjbsgeoaeshlwttatpxkfquw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925376.1920853-1683-103577040780256/AnsiballZ_edpm_container_manage.py'
Dec 05 09:02:57 compute-1 sudo[94954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:57 compute-1 python3[94956]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:02:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 09:02:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 09:02:57 compute-1 podman[94995]: 2025-12-05 09:02:57.644843111 +0000 UTC m=+0.050595796 container create 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 09:02:57 compute-1 podman[94995]: 2025-12-05 09:02:57.618191132 +0000 UTC m=+0.023943837 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 05 09:02:57 compute-1 python3[94956]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 05 09:02:57 compute-1 sudo[94954]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:58 compute-1 sudo[95183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olfxijztfanluaosbkoswzmejbbzzhsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925377.9965994-1707-218434523691253/AnsiballZ_stat.py'
Dec 05 09:02:58 compute-1 sudo[95183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 09:02:58 compute-1 python3.9[95185]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:02:58 compute-1 sudo[95183]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:59 compute-1 sudo[95337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqslclsspyvaotxitbzdvaefyalhnuaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925378.8603284-1734-104778411197354/AnsiballZ_file.py'
Dec 05 09:02:59 compute-1 sudo[95337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:59 compute-1 python3.9[95339]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:02:59 compute-1 sudo[95337]: pam_unix(sudo:session): session closed for user root
Dec 05 09:02:59 compute-1 sudo[95413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-firadmbpcxypfhpafabhrjqtbnbpwfzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925378.8603284-1734-104778411197354/AnsiballZ_stat.py'
Dec 05 09:02:59 compute-1 sudo[95413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:02:59 compute-1 python3.9[95415]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:02:59 compute-1 sudo[95413]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:00 compute-1 sudo[95564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iprdnulxgyqfpnxczngvnmzojfxnbfyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925379.881901-1734-219023923040916/AnsiballZ_copy.py'
Dec 05 09:03:00 compute-1 sudo[95564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:00 compute-1 python3.9[95566]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764925379.881901-1734-219023923040916/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:00 compute-1 sudo[95564]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:00 compute-1 sudo[95640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkspwkufuauxohfpgrcwflfaacehpceg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925379.881901-1734-219023923040916/AnsiballZ_systemd.py'
Dec 05 09:03:00 compute-1 sudo[95640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:01 compute-1 python3.9[95642]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:03:01 compute-1 systemd[1]: Reloading.
Dec 05 09:03:01 compute-1 systemd-rc-local-generator[95666]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:03:01 compute-1 systemd-sysv-generator[95672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:03:01 compute-1 sudo[95640]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:01 compute-1 sudo[95751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiogobhqlbnraswtvioxgnnvkqzcxnha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925379.881901-1734-219023923040916/AnsiballZ_systemd.py'
Dec 05 09:03:01 compute-1 sudo[95751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:01 compute-1 python3.9[95753]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:03:02 compute-1 systemd[1]: Reloading.
Dec 05 09:03:02 compute-1 systemd-rc-local-generator[95783]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:03:02 compute-1 systemd-sysv-generator[95786]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:03:02 compute-1 systemd[1]: Starting ovn_controller container...
Dec 05 09:03:02 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 05 09:03:02 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:03:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8446a54ebe833412efad51589c3496ddae1fbea2d1582e21c75e35d3f419923/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 05 09:03:02 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf.
Dec 05 09:03:02 compute-1 podman[95794]: 2025-12-05 09:03:02.441647818 +0000 UTC m=+0.147365164 container init 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:03:02 compute-1 ovn_controller[95809]: + sudo -E kolla_set_configs
Dec 05 09:03:02 compute-1 podman[95794]: 2025-12-05 09:03:02.480335932 +0000 UTC m=+0.186053258 container start 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:03:02 compute-1 edpm-start-podman-container[95794]: ovn_controller
Dec 05 09:03:02 compute-1 systemd[1]: Created slice User Slice of UID 0.
Dec 05 09:03:02 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 05 09:03:02 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 05 09:03:02 compute-1 systemd[1]: Starting User Manager for UID 0...
Dec 05 09:03:02 compute-1 edpm-start-podman-container[95793]: Creating additional drop-in dependency for "ovn_controller" (0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf)
Dec 05 09:03:02 compute-1 podman[95815]: 2025-12-05 09:03:02.571068445 +0000 UTC m=+0.077914706 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:03:02 compute-1 systemd[1]: 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf-2524713450852a84.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 09:03:02 compute-1 systemd[1]: 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf-2524713450852a84.service: Failed with result 'exit-code'.
Dec 05 09:03:02 compute-1 systemd[95851]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 05 09:03:02 compute-1 systemd[1]: Reloading.
Dec 05 09:03:02 compute-1 systemd-sysv-generator[95902]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:03:02 compute-1 systemd-rc-local-generator[95897]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:03:02 compute-1 systemd[95851]: Queued start job for default target Main User Target.
Dec 05 09:03:02 compute-1 systemd[95851]: Created slice User Application Slice.
Dec 05 09:03:02 compute-1 systemd[95851]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 05 09:03:02 compute-1 systemd[95851]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 09:03:02 compute-1 systemd[95851]: Reached target Paths.
Dec 05 09:03:02 compute-1 systemd[95851]: Reached target Timers.
Dec 05 09:03:02 compute-1 systemd[95851]: Starting D-Bus User Message Bus Socket...
Dec 05 09:03:02 compute-1 systemd[95851]: Starting Create User's Volatile Files and Directories...
Dec 05 09:03:02 compute-1 systemd[95851]: Finished Create User's Volatile Files and Directories.
Dec 05 09:03:02 compute-1 systemd[95851]: Listening on D-Bus User Message Bus Socket.
Dec 05 09:03:02 compute-1 systemd[95851]: Reached target Sockets.
Dec 05 09:03:02 compute-1 systemd[95851]: Reached target Basic System.
Dec 05 09:03:02 compute-1 systemd[95851]: Reached target Main User Target.
Dec 05 09:03:02 compute-1 systemd[95851]: Startup finished in 154ms.
Dec 05 09:03:02 compute-1 systemd[1]: Started User Manager for UID 0.
Dec 05 09:03:02 compute-1 systemd[1]: Started ovn_controller container.
Dec 05 09:03:02 compute-1 systemd[1]: Started Session c1 of User root.
Dec 05 09:03:02 compute-1 sudo[95751]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:02 compute-1 ovn_controller[95809]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 09:03:02 compute-1 ovn_controller[95809]: INFO:__main__:Validating config file
Dec 05 09:03:02 compute-1 ovn_controller[95809]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 09:03:02 compute-1 ovn_controller[95809]: INFO:__main__:Writing out command to execute
Dec 05 09:03:02 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 05 09:03:02 compute-1 ovn_controller[95809]: ++ cat /run_command
Dec 05 09:03:02 compute-1 ovn_controller[95809]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 05 09:03:02 compute-1 ovn_controller[95809]: + ARGS=
Dec 05 09:03:02 compute-1 ovn_controller[95809]: + sudo kolla_copy_cacerts
Dec 05 09:03:03 compute-1 systemd[1]: Started Session c2 of User root.
Dec 05 09:03:03 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 05 09:03:03 compute-1 ovn_controller[95809]: + [[ ! -n '' ]]
Dec 05 09:03:03 compute-1 ovn_controller[95809]: + . kolla_extend_start
Dec 05 09:03:03 compute-1 ovn_controller[95809]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 05 09:03:03 compute-1 ovn_controller[95809]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 05 09:03:03 compute-1 ovn_controller[95809]: + umask 0022
Dec 05 09:03:03 compute-1 ovn_controller[95809]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.0725] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.0734] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.0751] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.0756] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.0759] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 05 09:03:03 compute-1 kernel: br-int: entered promiscuous mode
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 09:03:03 compute-1 ovn_controller[95809]: 2025-12-05T09:03:03Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.1056] manager: (ovn-0e9f3e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.1074] manager: (ovn-27a42d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.1081] manager: (ovn-4a37a0-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 05 09:03:03 compute-1 systemd-udevd[95945]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:03:03 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Dec 05 09:03:03 compute-1 systemd-udevd[95946]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.1256] device (genev_sys_6081): carrier: link connected
Dec 05 09:03:03 compute-1 NetworkManager[55704]: <info>  [1764925383.1259] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Dec 05 09:03:03 compute-1 python3.9[96075]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Dec 05 09:03:04 compute-1 sudo[96225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cydfkdcdaljbssbfaxmfxxsudqwsttdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925384.4262702-1857-225530027133996/AnsiballZ_stat.py'
Dec 05 09:03:04 compute-1 sudo[96225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:04 compute-1 python3.9[96227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:04 compute-1 sudo[96225]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:05 compute-1 sudo[96350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yumaoltpqivhmasdemdhbvxsliqkqobk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925384.4262702-1857-225530027133996/AnsiballZ_copy.py'
Dec 05 09:03:05 compute-1 sudo[96350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:05 compute-1 python3.9[96352]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925384.4262702-1857-225530027133996/.source.yaml _original_basename=.91s6k795 follow=False checksum=45f167ec2f7695cf97f26f5af6bb6cda689487ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:05 compute-1 sudo[96350]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:06 compute-1 sudo[96502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcoxspgrzzitoqlvcdoedwhwmejtutct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925385.784468-1902-99028922550764/AnsiballZ_command.py'
Dec 05 09:03:06 compute-1 sudo[96502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:06 compute-1 python3.9[96504]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:03:06 compute-1 ovs-vsctl[96505]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 05 09:03:06 compute-1 sudo[96502]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:06 compute-1 sshd-session[96280]: Received disconnect from 43.225.158.169 port 49385:11: Bye Bye [preauth]
Dec 05 09:03:06 compute-1 sshd-session[96280]: Disconnected from authenticating user root 43.225.158.169 port 49385 [preauth]
Dec 05 09:03:06 compute-1 sudo[96655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loiglkutvfopmiysuefplnjmzowhcepx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925386.561758-1926-169823235347243/AnsiballZ_command.py'
Dec 05 09:03:06 compute-1 sudo[96655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:07 compute-1 python3.9[96657]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:03:07 compute-1 ovs-vsctl[96659]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 05 09:03:07 compute-1 sudo[96655]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:07 compute-1 sudo[96810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifkidonjvsbnuuohhdilzsbbsmbjpcrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925387.6840122-1968-78982859671985/AnsiballZ_command.py'
Dec 05 09:03:07 compute-1 sudo[96810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:08 compute-1 python3.9[96812]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:03:08 compute-1 ovs-vsctl[96813]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 05 09:03:08 compute-1 sudo[96810]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:08 compute-1 sshd-session[85156]: Connection closed by 192.168.122.30 port 54584
Dec 05 09:03:08 compute-1 sshd-session[85153]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:03:08 compute-1 systemd[1]: session-21.scope: Deactivated successfully.
Dec 05 09:03:08 compute-1 systemd[1]: session-21.scope: Consumed 49.307s CPU time.
Dec 05 09:03:08 compute-1 systemd-logind[807]: Session 21 logged out. Waiting for processes to exit.
Dec 05 09:03:08 compute-1 systemd-logind[807]: Removed session 21.
Dec 05 09:03:13 compute-1 systemd[1]: Stopping User Manager for UID 0...
Dec 05 09:03:13 compute-1 systemd[95851]: Activating special unit Exit the Session...
Dec 05 09:03:13 compute-1 systemd[95851]: Stopped target Main User Target.
Dec 05 09:03:13 compute-1 systemd[95851]: Stopped target Basic System.
Dec 05 09:03:13 compute-1 systemd[95851]: Stopped target Paths.
Dec 05 09:03:13 compute-1 systemd[95851]: Stopped target Sockets.
Dec 05 09:03:13 compute-1 systemd[95851]: Stopped target Timers.
Dec 05 09:03:13 compute-1 systemd[95851]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 05 09:03:13 compute-1 systemd[95851]: Closed D-Bus User Message Bus Socket.
Dec 05 09:03:13 compute-1 systemd[95851]: Stopped Create User's Volatile Files and Directories.
Dec 05 09:03:13 compute-1 systemd[95851]: Removed slice User Application Slice.
Dec 05 09:03:13 compute-1 systemd[95851]: Reached target Shutdown.
Dec 05 09:03:13 compute-1 systemd[95851]: Finished Exit the Session.
Dec 05 09:03:13 compute-1 systemd[95851]: Reached target Exit the Session.
Dec 05 09:03:13 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Dec 05 09:03:13 compute-1 systemd[1]: Stopped User Manager for UID 0.
Dec 05 09:03:13 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 05 09:03:13 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 05 09:03:13 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 05 09:03:13 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 05 09:03:13 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Dec 05 09:03:14 compute-1 sshd-session[96841]: Accepted publickey for zuul from 192.168.122.30 port 36040 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:03:14 compute-1 systemd-logind[807]: New session 23 of user zuul.
Dec 05 09:03:14 compute-1 systemd[1]: Started Session 23 of User zuul.
Dec 05 09:03:14 compute-1 sshd-session[96841]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:03:15 compute-1 python3.9[96994]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:03:16 compute-1 sudo[97148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvmenhhdemozolhjuulqkgbfevsjpomj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925396.3380513-63-135204700100217/AnsiballZ_file.py'
Dec 05 09:03:16 compute-1 sudo[97148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:16 compute-1 python3.9[97150]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:17 compute-1 sudo[97148]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:17 compute-1 sudo[97300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwdgvxiagznsnntzzovyxxupqnqrewsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925397.143861-63-53910229763899/AnsiballZ_file.py'
Dec 05 09:03:17 compute-1 sudo[97300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:17 compute-1 python3.9[97302]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:17 compute-1 sudo[97300]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:18 compute-1 sudo[97452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afgtysukcttcrafoacgaxjzbtnqxyrua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925397.9219215-63-158319592569401/AnsiballZ_file.py'
Dec 05 09:03:18 compute-1 sudo[97452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:18 compute-1 python3.9[97454]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:18 compute-1 sudo[97452]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:18 compute-1 sudo[97604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiurelpffcsobzqzjjrgepktlqlmpuzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925398.5864656-63-274681549693940/AnsiballZ_file.py'
Dec 05 09:03:18 compute-1 sudo[97604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:19 compute-1 python3.9[97606]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:19 compute-1 sudo[97604]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:19 compute-1 sudo[97756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tutnafrgkrhzvekktrcxhrmdqjafsmyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925399.253128-63-4157954250340/AnsiballZ_file.py'
Dec 05 09:03:19 compute-1 sudo[97756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:19 compute-1 python3.9[97758]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:19 compute-1 sudo[97756]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:20 compute-1 python3.9[97908]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:03:21 compute-1 sudo[98059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piunwsouysckughvvarxvwawhbxonszp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925401.0518444-195-254974351611544/AnsiballZ_seboolean.py'
Dec 05 09:03:21 compute-1 sudo[98059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:21 compute-1 python3.9[98061]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 05 09:03:22 compute-1 sudo[98059]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:23 compute-1 python3.9[98211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:24 compute-1 python3.9[98332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925402.7914443-219-261221216540014/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:25 compute-1 python3.9[98482]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:26 compute-1 python3.9[98603]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925405.0936837-265-156602328495914/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:26 compute-1 sudo[98753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wregaarkurzezcanyrdepazmnrpkucxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925406.540212-315-91992668752807/AnsiballZ_setup.py'
Dec 05 09:03:26 compute-1 sudo[98753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:27 compute-1 python3.9[98755]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 09:03:27 compute-1 sudo[98753]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:27 compute-1 sudo[98837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgfxxzbzqotoebsptqriiopnwkbrnche ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925406.540212-315-91992668752807/AnsiballZ_dnf.py'
Dec 05 09:03:27 compute-1 sudo[98837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:28 compute-1 python3.9[98839]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 09:03:29 compute-1 sudo[98837]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:30 compute-1 sudo[98990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpjlgnpwzuwffuevommfujfrtexhlgan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925410.1129951-351-85922830152345/AnsiballZ_systemd.py'
Dec 05 09:03:30 compute-1 sudo[98990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:31 compute-1 python3.9[98992]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 09:03:31 compute-1 sudo[98990]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:31 compute-1 python3.9[99145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:32 compute-1 python3.9[99266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925411.396142-375-26743862216334/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:32 compute-1 ovn_controller[95809]: 2025-12-05T09:03:32Z|00025|memory|INFO|16128 kB peak resident set size after 29.9 seconds
Dec 05 09:03:32 compute-1 ovn_controller[95809]: 2025-12-05T09:03:32Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Dec 05 09:03:33 compute-1 podman[99390]: 2025-12-05 09:03:33.003259804 +0000 UTC m=+0.134545665 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec 05 09:03:33 compute-1 python3.9[99428]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:33 compute-1 python3.9[99562]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925412.5722733-375-270621590096156/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:35 compute-1 python3.9[99712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:35 compute-1 python3.9[99833]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925414.9341817-507-68542568491522/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:36 compute-1 python3.9[99983]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:37 compute-1 python3.9[100104]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925416.174455-507-156690362296727/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:38 compute-1 python3.9[100254]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:03:38 compute-1 sudo[100406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brcqgpztuobgsessgetjgxaksvzcdvoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925418.4251497-621-233221058226386/AnsiballZ_file.py'
Dec 05 09:03:38 compute-1 sudo[100406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:38 compute-1 python3.9[100408]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:38 compute-1 sudo[100406]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:39 compute-1 sudo[100558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkkxewnrjsctxzaspmedsihbwzrinfvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925419.1205423-645-133845310674764/AnsiballZ_stat.py'
Dec 05 09:03:39 compute-1 sudo[100558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:39 compute-1 python3.9[100560]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:39 compute-1 sudo[100558]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:40 compute-1 sudo[100636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxrpsguhihukxhjpnfpdbshpiwkcowpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925419.1205423-645-133845310674764/AnsiballZ_file.py'
Dec 05 09:03:40 compute-1 sudo[100636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:40 compute-1 python3.9[100638]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:40 compute-1 sudo[100636]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:40 compute-1 sudo[100788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uorekwvudepnhcggdswybzpmqxqhpspt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925420.5593529-645-138874113250927/AnsiballZ_stat.py'
Dec 05 09:03:40 compute-1 sudo[100788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:41 compute-1 python3.9[100790]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:41 compute-1 sudo[100788]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:41 compute-1 sudo[100866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcupwzfilkcrliwcywbirrekkdfcjnvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925420.5593529-645-138874113250927/AnsiballZ_file.py'
Dec 05 09:03:41 compute-1 sudo[100866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:41 compute-1 python3.9[100868]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:41 compute-1 sudo[100866]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:42 compute-1 sudo[101020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vetdosnawepumdaglglqhnvjyqhrqtxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925421.7412977-714-255518929075297/AnsiballZ_file.py'
Dec 05 09:03:42 compute-1 sudo[101020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:42 compute-1 python3.9[101022]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:42 compute-1 sudo[101020]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:42 compute-1 sudo[101172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfcgpssloykakziocklwdkavqdvalqwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925422.4682908-738-243130571847830/AnsiballZ_stat.py'
Dec 05 09:03:42 compute-1 sudo[101172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:42 compute-1 python3.9[101174]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:42 compute-1 sudo[101172]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:43 compute-1 sudo[101250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxqpdwqwmrvhydsoajopfkpladrqvoji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925422.4682908-738-243130571847830/AnsiballZ_file.py'
Dec 05 09:03:43 compute-1 sudo[101250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:43 compute-1 python3.9[101252]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:43 compute-1 sudo[101250]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:44 compute-1 sshd-session[100893]: Connection reset by authenticating user root 91.202.233.33 port 49300 [preauth]
Dec 05 09:03:44 compute-1 sudo[101404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wudwlbvflywpftgzxptajqurblvwfhdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925423.849242-774-64768273932309/AnsiballZ_stat.py'
Dec 05 09:03:44 compute-1 sudo[101404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:44 compute-1 python3.9[101406]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:44 compute-1 sudo[101404]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:44 compute-1 sudo[101484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agkssafnnzptdkomclfqskvfnntxjcjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925423.849242-774-64768273932309/AnsiballZ_file.py'
Dec 05 09:03:44 compute-1 sudo[101484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:44 compute-1 sshd-session[101253]: Received disconnect from 185.118.15.236 port 34180:11: Bye Bye [preauth]
Dec 05 09:03:44 compute-1 sshd-session[101253]: Disconnected from authenticating user root 185.118.15.236 port 34180 [preauth]
Dec 05 09:03:44 compute-1 python3.9[101486]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:44 compute-1 sudo[101484]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:45 compute-1 sudo[101636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njymjfifyqenlksldmvteabsmwuykkmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925425.0127144-810-113111984818830/AnsiballZ_systemd.py'
Dec 05 09:03:45 compute-1 sudo[101636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:45 compute-1 python3.9[101638]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:03:45 compute-1 systemd[1]: Reloading.
Dec 05 09:03:45 compute-1 systemd-rc-local-generator[101665]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:03:45 compute-1 systemd-sysv-generator[101670]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:03:45 compute-1 sudo[101636]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:46 compute-1 sshd-session[101407]: Connection reset by authenticating user root 91.202.233.33 port 49304 [preauth]
Dec 05 09:03:46 compute-1 sudo[101827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqyasjujywvwlsocwafrxigqzgjlmsek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925426.168337-834-256254046280709/AnsiballZ_stat.py'
Dec 05 09:03:46 compute-1 sudo[101827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:46 compute-1 python3.9[101829]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:46 compute-1 sudo[101827]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:46 compute-1 sudo[101906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gteyethanhwujukwtlqcisfhjchckizs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925426.168337-834-256254046280709/AnsiballZ_file.py'
Dec 05 09:03:46 compute-1 sudo[101906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:47 compute-1 python3.9[101908]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:47 compute-1 sudo[101906]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:47 compute-1 sudo[102058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njepxfavyqpjrerzkmfajduvyzvawkxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925427.3730156-870-86256476299095/AnsiballZ_stat.py'
Dec 05 09:03:47 compute-1 sudo[102058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:47 compute-1 python3.9[102060]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:47 compute-1 sudo[102058]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:48 compute-1 sudo[102136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvtujgdqjlijwfcynrqnxuscnfwgfirp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925427.3730156-870-86256476299095/AnsiballZ_file.py'
Dec 05 09:03:48 compute-1 sudo[102136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:48 compute-1 python3.9[102138]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:48 compute-1 sudo[102136]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:48 compute-1 sudo[102288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqpqwcwmdobrrwpedynkebbxoxrvstuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925428.5442152-906-67425708146179/AnsiballZ_systemd.py'
Dec 05 09:03:48 compute-1 sudo[102288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:48 compute-1 sshd-session[101817]: Connection reset by authenticating user root 91.202.233.33 port 49318 [preauth]
Dec 05 09:03:49 compute-1 python3.9[102290]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:03:49 compute-1 systemd[1]: Reloading.
Dec 05 09:03:49 compute-1 systemd-sysv-generator[102321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:03:49 compute-1 systemd-rc-local-generator[102318]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:03:49 compute-1 systemd[1]: Starting Create netns directory...
Dec 05 09:03:49 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 09:03:49 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 09:03:49 compute-1 systemd[1]: Finished Create netns directory.
Dec 05 09:03:49 compute-1 sudo[102288]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:50 compute-1 sudo[102482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzkfmdvxhszstqrrjjjihzfuypbfhdhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925429.9393032-936-130979989082940/AnsiballZ_file.py'
Dec 05 09:03:50 compute-1 sudo[102482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:50 compute-1 python3.9[102484]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:50 compute-1 sudo[102482]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:50 compute-1 sudo[102634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuddctbhmtakzpyyagqwxujupiilcdxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925430.643022-960-260499838130423/AnsiballZ_stat.py'
Dec 05 09:03:50 compute-1 sudo[102634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:51 compute-1 python3.9[102636]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:51 compute-1 sudo[102634]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:51 compute-1 sudo[102757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgdmzgdmsdywvlndqkrugvxrrwzmwinf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925430.643022-960-260499838130423/AnsiballZ_copy.py'
Dec 05 09:03:51 compute-1 sudo[102757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:51 compute-1 sshd-session[102291]: Connection reset by authenticating user root 91.202.233.33 port 49328 [preauth]
Dec 05 09:03:51 compute-1 python3.9[102759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925430.643022-960-260499838130423/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:51 compute-1 sudo[102757]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:52 compute-1 sudo[102911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwhjwsxyapoqdtzoskngmgrnifsigqss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925432.2315204-1011-158164363297228/AnsiballZ_file.py'
Dec 05 09:03:52 compute-1 sudo[102911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:52 compute-1 python3.9[102913]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:52 compute-1 sudo[102911]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:53 compute-1 sudo[103063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrjtxfjhhncptcvjjmdtohcgnaelyjgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925432.932485-1035-264009912061065/AnsiballZ_file.py'
Dec 05 09:03:53 compute-1 sudo[103063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:53 compute-1 python3.9[103065]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:03:53 compute-1 sudo[103063]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:53 compute-1 sshd-session[102784]: Connection reset by authenticating user root 91.202.233.33 port 57492 [preauth]
Dec 05 09:03:53 compute-1 sudo[103215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfyjxbkgtnbyecnfyziqnvgzltpahpqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925433.580465-1059-212288807504040/AnsiballZ_stat.py'
Dec 05 09:03:53 compute-1 sudo[103215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:54 compute-1 python3.9[103217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:03:54 compute-1 sudo[103215]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:54 compute-1 sudo[103338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubvanxunhyskllwitxbqwykbsopmzzir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925433.580465-1059-212288807504040/AnsiballZ_copy.py'
Dec 05 09:03:54 compute-1 sudo[103338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:54 compute-1 python3.9[103340]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925433.580465-1059-212288807504040/.source.json _original_basename=.8e1rljpn follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:54 compute-1 sudo[103338]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:55 compute-1 python3.9[103490]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:03:57 compute-1 sudo[103911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjcefzuxlsdzmmvviefmhgttcstpvxvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925436.9773698-1179-32906670957960/AnsiballZ_container_config_data.py'
Dec 05 09:03:57 compute-1 sudo[103911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:57 compute-1 python3.9[103913]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 05 09:03:57 compute-1 sudo[103911]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:58 compute-1 sudo[104063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arxaaxbcywswrjiibzjrhmirzixprokl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925438.0527916-1212-123231226979330/AnsiballZ_container_config_hash.py'
Dec 05 09:03:58 compute-1 sudo[104063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:58 compute-1 python3.9[104065]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:03:58 compute-1 sudo[104063]: pam_unix(sudo:session): session closed for user root
Dec 05 09:03:59 compute-1 sudo[104215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjijtaljefwdwhxhopvoinrliwpzsuhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925439.0561445-1239-237073276363706/AnsiballZ_podman_container_info.py'
Dec 05 09:03:59 compute-1 sudo[104215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:03:59 compute-1 python3.9[104217]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 09:03:59 compute-1 sudo[104215]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:01 compute-1 sudo[104395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlwhnynmwzirxqtyqlhadevideawxeme ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925440.5941377-1278-273027937680777/AnsiballZ_edpm_container_manage.py'
Dec 05 09:04:01 compute-1 sudo[104395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:01 compute-1 python3[104397]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:04:01 compute-1 podman[104432]: 2025-12-05 09:04:01.594210656 +0000 UTC m=+0.057935327 container create e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:04:01 compute-1 podman[104432]: 2025-12-05 09:04:01.562247454 +0000 UTC m=+0.025972155 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:04:01 compute-1 python3[104397]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:04:01 compute-1 sudo[104395]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:02 compute-1 sudo[104618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmlbyolrnunougpmwhqqjxgsrsxqxaej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925441.9821486-1302-276820827246914/AnsiballZ_stat.py'
Dec 05 09:04:02 compute-1 sudo[104618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:02 compute-1 python3.9[104620]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:04:02 compute-1 sudo[104618]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:03 compute-1 sudo[104782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alwfiqasqcjeuwduhkjxdpfrisvmpwtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925442.790733-1329-131142531609514/AnsiballZ_file.py'
Dec 05 09:04:03 compute-1 sudo[104782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:03 compute-1 podman[104746]: 2025-12-05 09:04:03.166063207 +0000 UTC m=+0.118368776 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 09:04:03 compute-1 python3.9[104788]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:03 compute-1 sudo[104782]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:03 compute-1 sudo[104871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxecaanwyjxeztfkyecdkdqohgnxfgkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925442.790733-1329-131142531609514/AnsiballZ_stat.py'
Dec 05 09:04:03 compute-1 sudo[104871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:03 compute-1 python3.9[104873]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:04:03 compute-1 sudo[104871]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:04 compute-1 sudo[105022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzfpjgxlrmkzpkloexvbgdzmqzqbviia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925443.8359659-1329-168102504949568/AnsiballZ_copy.py'
Dec 05 09:04:04 compute-1 sudo[105022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:04 compute-1 python3.9[105024]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764925443.8359659-1329-168102504949568/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:04 compute-1 sudo[105022]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:04 compute-1 sudo[105098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvdzioaoxevdorkfuruugjkxrhefapuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925443.8359659-1329-168102504949568/AnsiballZ_systemd.py'
Dec 05 09:04:04 compute-1 sudo[105098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:05 compute-1 python3.9[105100]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:04:05 compute-1 systemd[1]: Reloading.
Dec 05 09:04:05 compute-1 systemd-sysv-generator[105131]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:04:05 compute-1 systemd-rc-local-generator[105124]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:04:05 compute-1 sudo[105098]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:05 compute-1 sudo[105209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aroiawpnudlnczipnocmlhyoeduzbeyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925443.8359659-1329-168102504949568/AnsiballZ_systemd.py'
Dec 05 09:04:05 compute-1 sudo[105209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:05 compute-1 python3.9[105211]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:04:06 compute-1 systemd[1]: Reloading.
Dec 05 09:04:06 compute-1 systemd-rc-local-generator[105239]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:04:06 compute-1 systemd-sysv-generator[105245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:04:06 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Dec 05 09:04:06 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:04:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61fece0c512645d01ac6c8a46f0b3f39666f2a1a889ecd0696c003893f6501e3/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 05 09:04:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61fece0c512645d01ac6c8a46f0b3f39666f2a1a889ecd0696c003893f6501e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:04:06 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5.
Dec 05 09:04:06 compute-1 podman[105252]: 2025-12-05 09:04:06.403612852 +0000 UTC m=+0.143557211 container init e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + sudo -E kolla_set_configs
Dec 05 09:04:06 compute-1 podman[105252]: 2025-12-05 09:04:06.432798686 +0000 UTC m=+0.172743015 container start e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 05 09:04:06 compute-1 edpm-start-podman-container[105252]: ovn_metadata_agent
Dec 05 09:04:06 compute-1 edpm-start-podman-container[105251]: Creating additional drop-in dependency for "ovn_metadata_agent" (e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5)
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Validating config file
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Copying service configuration files
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 05 09:04:06 compute-1 podman[105273]: 2025-12-05 09:04:06.51029974 +0000 UTC m=+0.061916764 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Writing out command to execute
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: ++ cat /run_command
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + CMD=neutron-ovn-metadata-agent
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + ARGS=
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + sudo kolla_copy_cacerts
Dec 05 09:04:06 compute-1 systemd[1]: Reloading.
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + [[ ! -n '' ]]
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + . kolla_extend_start
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: Running command: 'neutron-ovn-metadata-agent'
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + umask 0022
Dec 05 09:04:06 compute-1 ovn_metadata_agent[105267]: + exec neutron-ovn-metadata-agent
Dec 05 09:04:06 compute-1 systemd-rc-local-generator[105339]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:04:06 compute-1 systemd-sysv-generator[105344]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:04:06 compute-1 systemd[1]: Started ovn_metadata_agent container.
Dec 05 09:04:06 compute-1 sudo[105209]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:07 compute-1 python3.9[105502]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Dec 05 09:04:08 compute-1 sudo[105652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwfoybdczwjukcunhqdqggqgntizzusg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925448.1306362-1452-539005155020/AnsiballZ_stat.py'
Dec 05 09:04:08 compute-1 sudo[105652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:08 compute-1 python3.9[105654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:04:08 compute-1 sudo[105652]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.804 105272 INFO neutron.common.config [-] Logging enabled!
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.805 105272 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.805 105272 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.805 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.805 105272 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.805 105272 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.806 105272 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.806 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.806 105272 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.806 105272 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.806 105272 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.806 105272 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.806 105272 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.807 105272 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.808 105272 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.809 105272 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.809 105272 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.809 105272 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.809 105272 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.809 105272 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.809 105272 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.809 105272 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.810 105272 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.811 105272 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.811 105272 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.811 105272 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.811 105272 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.811 105272 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.811 105272 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.811 105272 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.811 105272 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.812 105272 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.812 105272 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.812 105272 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.812 105272 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.812 105272 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.812 105272 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.812 105272 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.812 105272 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.813 105272 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.813 105272 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.813 105272 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.813 105272 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.813 105272 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.813 105272 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.814 105272 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.814 105272 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.814 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.814 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.814 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.814 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.815 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.815 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.815 105272 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.815 105272 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.815 105272 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.816 105272 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.816 105272 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.816 105272 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.816 105272 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.816 105272 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.816 105272 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.816 105272 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.817 105272 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.817 105272 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.817 105272 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.817 105272 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.817 105272 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.817 105272 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.817 105272 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.818 105272 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.819 105272 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.819 105272 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.819 105272 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.819 105272 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.819 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.819 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.819 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.820 105272 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.821 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.821 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.821 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.821 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.821 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.821 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.821 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.821 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.822 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.822 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.822 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.822 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.822 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.822 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.822 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.822 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.823 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.824 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.824 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.824 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.824 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.824 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.824 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.824 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.825 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.826 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.827 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.827 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.827 105272 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.827 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.827 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.827 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.827 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.827 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.828 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.828 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.828 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.828 105272 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.828 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.828 105272 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.828 105272 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.828 105272 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.829 105272 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.829 105272 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.829 105272 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.829 105272 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.829 105272 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.829 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.829 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.829 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.830 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.831 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.832 105272 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.832 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.832 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.832 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.832 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.832 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.832 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.833 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.834 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.834 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.834 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.834 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.834 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.834 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.834 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.834 105272 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.835 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.836 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.837 105272 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.838 105272 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.838 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.838 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.838 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.838 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.838 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.839 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.840 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.840 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.840 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.840 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.840 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.840 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.840 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.840 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.841 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.841 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.841 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.841 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.841 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.841 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.841 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.841 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.842 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.842 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.842 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.842 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.842 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.842 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.842 105272 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.842 105272 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.894 105272 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.894 105272 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.894 105272 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.895 105272 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.895 105272 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.909 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2deaed7a-68f6-453c-b7f8-10ef033f3762 (UUID: 2deaed7a-68f6-453c-b7f8-10ef033f3762) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.950 105272 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.951 105272 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.951 105272 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.951 105272 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.955 105272 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.962 105272 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.969 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2deaed7a-68f6-453c-b7f8-10ef033f3762'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], external_ids={}, name=2deaed7a-68f6-453c-b7f8-10ef033f3762, nb_cfg_timestamp=1764925391094, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.970 105272 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fc06f54ca90>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.971 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.971 105272 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.972 105272 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.972 105272 INFO oslo_service.service [-] Starting 1 workers
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.979 105272 DEBUG oslo_service.service [-] Started child 105739 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.983 105739 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-194177'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 05 09:04:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:08.984 105272 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpvw29zzsg/privsep.sock']
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.011 105739 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.011 105739 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.011 105739 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.018 105739 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.024 105739 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.032 105739 INFO eventlet.wsgi.server [-] (105739) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 05 09:04:09 compute-1 sudo[105781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpfhukzdkbncaspcxzteygucltcqkufm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925448.1306362-1452-539005155020/AnsiballZ_copy.py'
Dec 05 09:04:09 compute-1 sudo[105781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:09 compute-1 python3.9[105783]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925448.1306362-1452-539005155020/.source.yaml _original_basename=.dmktxp7c follow=False checksum=9613cbd75e5ca3331d8a273f71e415b309cae055 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:09 compute-1 sudo[105781]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:09 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.791 105272 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.792 105272 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpvw29zzsg/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.595 105809 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.601 105809 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.605 105809 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.606 105809 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105809
Dec 05 09:04:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:09.795 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[3db9396a-3986-443e-8cdf-89704dd2fb22]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:04:09 compute-1 sshd-session[96844]: Connection closed by 192.168.122.30 port 36040
Dec 05 09:04:09 compute-1 sshd-session[96841]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:04:09 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Dec 05 09:04:09 compute-1 systemd[1]: session-23.scope: Consumed 38.006s CPU time.
Dec 05 09:04:09 compute-1 systemd-logind[807]: Session 23 logged out. Waiting for processes to exit.
Dec 05 09:04:09 compute-1 systemd-logind[807]: Removed session 23.
Dec 05 09:04:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:10.357 105809 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:04:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:10.357 105809 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:04:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:10.357 105809 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:04:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:10.996 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[ba055c10-440f-448e-b27e-32cc5729ede2]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:04:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:10.999 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, column=external_ids, values=({'neutron:ovn-metadata-id': '46d1f1fa-1f9d-5b14-acf0-7195d1901cce'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.008 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.015 105272 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.016 105272 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.016 105272 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.016 105272 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.016 105272 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.016 105272 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.016 105272 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.017 105272 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.017 105272 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.017 105272 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.017 105272 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.017 105272 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.018 105272 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.018 105272 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.018 105272 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.018 105272 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.018 105272 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.018 105272 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.018 105272 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.019 105272 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.019 105272 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.019 105272 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.019 105272 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.019 105272 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.019 105272 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.020 105272 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.020 105272 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.020 105272 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.020 105272 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.020 105272 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.020 105272 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.021 105272 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.021 105272 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.021 105272 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.021 105272 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.021 105272 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.021 105272 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.022 105272 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.022 105272 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.022 105272 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.022 105272 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.022 105272 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.022 105272 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.022 105272 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.023 105272 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.024 105272 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.025 105272 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.026 105272 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.027 105272 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.028 105272 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.029 105272 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.030 105272 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.030 105272 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.030 105272 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.030 105272 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.030 105272 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.030 105272 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.030 105272 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.030 105272 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.031 105272 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.031 105272 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.031 105272 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.031 105272 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.031 105272 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.031 105272 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.031 105272 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.031 105272 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.032 105272 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.033 105272 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.033 105272 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.033 105272 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.033 105272 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.033 105272 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.034 105272 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.034 105272 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.034 105272 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.034 105272 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.034 105272 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.034 105272 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.034 105272 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.034 105272 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.035 105272 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.035 105272 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.035 105272 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.035 105272 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.035 105272 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.035 105272 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.035 105272 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.035 105272 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.036 105272 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.037 105272 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.038 105272 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.038 105272 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.038 105272 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.038 105272 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.038 105272 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.038 105272 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.038 105272 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.038 105272 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.039 105272 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.039 105272 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.039 105272 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.039 105272 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.039 105272 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.039 105272 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.039 105272 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.040 105272 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.041 105272 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.042 105272 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.043 105272 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.044 105272 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.045 105272 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.046 105272 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.047 105272 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.048 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.049 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.050 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.051 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.051 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.051 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.051 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.051 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.051 105272 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.051 105272 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.051 105272 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.052 105272 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.052 105272 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:04:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:04:11.052 105272 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 09:04:15 compute-1 sshd-session[105815]: Accepted publickey for zuul from 192.168.122.30 port 59102 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:04:15 compute-1 systemd-logind[807]: New session 24 of user zuul.
Dec 05 09:04:15 compute-1 systemd[1]: Started Session 24 of User zuul.
Dec 05 09:04:15 compute-1 sshd-session[105815]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:04:16 compute-1 python3.9[105968]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:04:17 compute-1 sudo[106122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sazzxgleyizquyrqmsoxfvjlfyqkbtne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925456.7656476-63-518140175216/AnsiballZ_command.py'
Dec 05 09:04:17 compute-1 sudo[106122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:17 compute-1 python3.9[106124]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:17 compute-1 sudo[106122]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:18 compute-1 sudo[106287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfvcybasnhpqotwizmihhotpisrscmqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925457.9355211-96-35164610137643/AnsiballZ_systemd_service.py'
Dec 05 09:04:18 compute-1 sudo[106287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:18 compute-1 python3.9[106289]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:04:18 compute-1 systemd[1]: Reloading.
Dec 05 09:04:18 compute-1 systemd-sysv-generator[106320]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:04:18 compute-1 systemd-rc-local-generator[106315]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:04:19 compute-1 sudo[106287]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:20 compute-1 python3.9[106476]: ansible-ansible.builtin.service_facts Invoked
Dec 05 09:04:20 compute-1 network[106493]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 09:04:20 compute-1 network[106494]: 'network-scripts' will be removed from distribution in near future.
Dec 05 09:04:20 compute-1 network[106495]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 09:04:21 compute-1 sshd-session[106376]: Received disconnect from 122.168.194.41 port 49866:11: Bye Bye [preauth]
Dec 05 09:04:21 compute-1 sshd-session[106376]: Disconnected from authenticating user root 122.168.194.41 port 49866 [preauth]
Dec 05 09:04:21 compute-1 sshd-session[106501]: Received disconnect from 43.225.158.169 port 34291:11: Bye Bye [preauth]
Dec 05 09:04:21 compute-1 sshd-session[106501]: Disconnected from authenticating user root 43.225.158.169 port 34291 [preauth]
Dec 05 09:04:23 compute-1 sudo[106756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buazznneagbadkqrktozqtzeownliafm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925463.2507849-153-932859423765/AnsiballZ_systemd_service.py'
Dec 05 09:04:23 compute-1 sudo[106756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:23 compute-1 python3.9[106758]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:04:23 compute-1 sudo[106756]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:24 compute-1 sudo[106909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzekorfogmsnyrrdgqgjgkkemaeosfux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925464.0497427-153-202470264870124/AnsiballZ_systemd_service.py'
Dec 05 09:04:24 compute-1 sudo[106909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:24 compute-1 python3.9[106911]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:04:24 compute-1 sudo[106909]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:25 compute-1 sudo[107062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evkizqyyvnljilmypvkayshdssgornsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925464.895283-153-1972322862549/AnsiballZ_systemd_service.py'
Dec 05 09:04:25 compute-1 sudo[107062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:25 compute-1 python3.9[107064]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:04:25 compute-1 sudo[107062]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:26 compute-1 sudo[107215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzmcgkjzynucetydzlhjcbgmsuxpqcmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925465.7720525-153-76653850183736/AnsiballZ_systemd_service.py'
Dec 05 09:04:26 compute-1 sudo[107215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:26 compute-1 python3.9[107217]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:04:26 compute-1 sudo[107215]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:27 compute-1 sudo[107368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxfpzdqgtkxncrfrzpboubzjxmqzqqfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925466.7936866-153-21143698236975/AnsiballZ_systemd_service.py'
Dec 05 09:04:27 compute-1 sudo[107368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:27 compute-1 python3.9[107370]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:04:27 compute-1 sudo[107368]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:27 compute-1 sudo[107521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uggczwskvonotkmjqkfsqehlmhlgwnxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925467.6274054-153-227806486701104/AnsiballZ_systemd_service.py'
Dec 05 09:04:27 compute-1 sudo[107521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:28 compute-1 python3.9[107523]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:04:28 compute-1 sudo[107521]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:28 compute-1 sudo[107674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfgpmkufmvfkoqnorimkckbbcmamqmeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925468.3902593-153-19050303842997/AnsiballZ_systemd_service.py'
Dec 05 09:04:28 compute-1 sudo[107674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:29 compute-1 python3.9[107676]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:04:29 compute-1 sudo[107674]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:30 compute-1 sudo[107827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtbgoufkqlsjzeltcurlgvgovlotqrzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925470.1333356-309-185229085172639/AnsiballZ_file.py'
Dec 05 09:04:30 compute-1 sudo[107827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:31 compute-1 python3.9[107829]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:31 compute-1 sudo[107827]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:31 compute-1 sudo[107979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syrtfotsrvlbmforcobvpmssmhgyopbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925471.2526999-309-171234228096128/AnsiballZ_file.py'
Dec 05 09:04:31 compute-1 sudo[107979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:31 compute-1 python3.9[107981]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:31 compute-1 sudo[107979]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:32 compute-1 sudo[108131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnaifrolueoeckzmzrpzumbkkrcjxexs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925471.9039576-309-1803939796828/AnsiballZ_file.py'
Dec 05 09:04:32 compute-1 sudo[108131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:32 compute-1 python3.9[108133]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:32 compute-1 sudo[108131]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:32 compute-1 sudo[108283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkqchqhdrtdycyrdwtgiqvqznbduosqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925472.5509012-309-219146367019795/AnsiballZ_file.py'
Dec 05 09:04:32 compute-1 sudo[108283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:33 compute-1 python3.9[108285]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:33 compute-1 sudo[108283]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:33 compute-1 sudo[108446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iotmzxpxbznthyngumqhbedasfiqsvlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925473.1990924-309-108204987532900/AnsiballZ_file.py'
Dec 05 09:04:33 compute-1 sudo[108446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:33 compute-1 podman[108409]: 2025-12-05 09:04:33.591748852 +0000 UTC m=+0.099573156 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Dec 05 09:04:33 compute-1 python3.9[108451]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:33 compute-1 sudo[108446]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:34 compute-1 sudo[108612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynxqfcsumymgahhopwbmvkhrddlxqaaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925473.918001-309-15456438330847/AnsiballZ_file.py'
Dec 05 09:04:34 compute-1 sudo[108612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:34 compute-1 python3.9[108614]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:34 compute-1 sudo[108612]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:34 compute-1 sudo[108764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmaxanrgamkqwlinjbfqfctgrkagbxrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925474.615158-309-254335068785615/AnsiballZ_file.py'
Dec 05 09:04:34 compute-1 sudo[108764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:35 compute-1 python3.9[108766]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:35 compute-1 sudo[108764]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:35 compute-1 sudo[108916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzrpgdoroikxrjpkniijvopybkalkyrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925475.4305382-459-22929783201025/AnsiballZ_file.py'
Dec 05 09:04:35 compute-1 sudo[108916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:35 compute-1 python3.9[108918]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:35 compute-1 sudo[108916]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:36 compute-1 sudo[109068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdkwoffeiaqamiqbfhiylgtcfizkrcgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925476.0859842-459-60452499840891/AnsiballZ_file.py'
Dec 05 09:04:36 compute-1 sudo[109068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:36 compute-1 python3.9[109070]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:36 compute-1 sudo[109068]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:36 compute-1 podman[109071]: 2025-12-05 09:04:36.61633516 +0000 UTC m=+0.055568809 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:04:37 compute-1 sudo[109240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrrirztonwgxwigkjdqbfpcxajmvuagh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925476.7138484-459-211076468396458/AnsiballZ_file.py'
Dec 05 09:04:37 compute-1 sudo[109240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:37 compute-1 python3.9[109242]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:37 compute-1 sudo[109240]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:37 compute-1 sudo[109392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsoifyyvfqzftmowqooaavtqlxdckosq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925477.3905523-459-239387253908950/AnsiballZ_file.py'
Dec 05 09:04:37 compute-1 sudo[109392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:37 compute-1 python3.9[109394]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:37 compute-1 sudo[109392]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:38 compute-1 sudo[109544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgktqfqngvpfrrothqavhojjnfvhpnao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925478.068545-459-148618396954300/AnsiballZ_file.py'
Dec 05 09:04:38 compute-1 sudo[109544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:38 compute-1 python3.9[109546]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:38 compute-1 sudo[109544]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:38 compute-1 sudo[109696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grwbsztpsrvpsqftfwemajndgmqsttrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925478.7059433-459-118186060169153/AnsiballZ_file.py'
Dec 05 09:04:39 compute-1 sudo[109696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:39 compute-1 python3.9[109698]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:39 compute-1 sudo[109696]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:39 compute-1 sudo[109848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncqqbpnzsvnxrpcvkaopivlgxuvlmifq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925479.3679342-459-243403799553622/AnsiballZ_file.py'
Dec 05 09:04:39 compute-1 sudo[109848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:39 compute-1 python3.9[109850]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:04:39 compute-1 sudo[109848]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:40 compute-1 sudo[110000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awzpexhxsakilijdycmpgcrpgoxlwlwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925480.1620698-612-280683786021931/AnsiballZ_command.py'
Dec 05 09:04:40 compute-1 sudo[110000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:40 compute-1 python3.9[110002]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:40 compute-1 sudo[110000]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:41 compute-1 python3.9[110154]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 09:04:42 compute-1 sudo[110304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wybzakswrpwwnlvfovsescztpcenvggj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925481.9147625-666-148434879620597/AnsiballZ_systemd_service.py'
Dec 05 09:04:42 compute-1 sudo[110304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:42 compute-1 python3.9[110306]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:04:42 compute-1 systemd[1]: Reloading.
Dec 05 09:04:42 compute-1 systemd-sysv-generator[110338]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:04:42 compute-1 systemd-rc-local-generator[110335]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:04:42 compute-1 sudo[110304]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:43 compute-1 sudo[110492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unklptgvqmczmirjqfsfqsydystesjal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925483.0175636-690-66784407559816/AnsiballZ_command.py'
Dec 05 09:04:43 compute-1 sudo[110492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:43 compute-1 python3.9[110494]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:43 compute-1 sudo[110492]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:44 compute-1 sudo[110645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iocidrbrlcljoazihkgewjmiaqgzacpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925483.739674-690-153283383618025/AnsiballZ_command.py'
Dec 05 09:04:44 compute-1 sudo[110645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:44 compute-1 python3.9[110647]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:44 compute-1 sudo[110645]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:44 compute-1 sudo[110798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbrosmhjvjglqswqrsyzkxliqnowluqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925484.4137702-690-264847920586095/AnsiballZ_command.py'
Dec 05 09:04:44 compute-1 sudo[110798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:44 compute-1 python3.9[110800]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:44 compute-1 sudo[110798]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:45 compute-1 sudo[110951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beiifnnlpzyggwolicmkodsgoqkmujey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925485.094721-690-17935176424116/AnsiballZ_command.py'
Dec 05 09:04:45 compute-1 sudo[110951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:45 compute-1 python3.9[110953]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:45 compute-1 sudo[110951]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:46 compute-1 sudo[111104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwxjhbvkvbkyktarelqugqnjaileupov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925485.772398-690-175808700593396/AnsiballZ_command.py'
Dec 05 09:04:46 compute-1 sudo[111104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:46 compute-1 python3.9[111106]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:46 compute-1 sudo[111104]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:46 compute-1 sudo[111257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mirurabpqosyhnjjxzvjgyavvrbiozeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925486.4203637-690-202232607328396/AnsiballZ_command.py'
Dec 05 09:04:46 compute-1 sudo[111257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:46 compute-1 python3.9[111259]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:46 compute-1 sudo[111257]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:47 compute-1 sudo[111410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anqbzqosgzllhqeenqmgqkrmiagzqbya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925487.0726643-690-127310500895209/AnsiballZ_command.py'
Dec 05 09:04:47 compute-1 sudo[111410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:47 compute-1 python3.9[111412]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:04:47 compute-1 sudo[111410]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:48 compute-1 sudo[111563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnwzllfyzirgahacpyevecwikcbvrnnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925488.2402828-852-2677338521623/AnsiballZ_getent.py'
Dec 05 09:04:48 compute-1 sudo[111563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:48 compute-1 python3.9[111565]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 05 09:04:48 compute-1 sudo[111563]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:49 compute-1 sudo[111716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njvgoggyfmoltgghnfsxdachpmefcpop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925489.0995717-876-263390982503430/AnsiballZ_group.py'
Dec 05 09:04:49 compute-1 sudo[111716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:49 compute-1 python3.9[111718]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 09:04:49 compute-1 groupadd[111719]: group added to /etc/group: name=libvirt, GID=42473
Dec 05 09:04:49 compute-1 groupadd[111719]: group added to /etc/gshadow: name=libvirt
Dec 05 09:04:49 compute-1 groupadd[111719]: new group: name=libvirt, GID=42473
Dec 05 09:04:49 compute-1 sudo[111716]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:50 compute-1 sudo[111874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeyeansypwxkgfiwasnpvazessneoseo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925490.0684428-900-116357809279921/AnsiballZ_user.py'
Dec 05 09:04:50 compute-1 sudo[111874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:50 compute-1 python3.9[111876]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 09:04:50 compute-1 useradd[111878]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 09:04:50 compute-1 sudo[111874]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:51 compute-1 sudo[112034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoaviukrmuyddrjydnaykkngdfmjjirh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925491.4552333-933-164561185558678/AnsiballZ_setup.py'
Dec 05 09:04:51 compute-1 sudo[112034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:52 compute-1 python3.9[112036]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 09:04:52 compute-1 sudo[112034]: pam_unix(sudo:session): session closed for user root
Dec 05 09:04:52 compute-1 sudo[112118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdsorureuuanvkwrjawbrpacfisepuff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925491.4552333-933-164561185558678/AnsiballZ_dnf.py'
Dec 05 09:04:52 compute-1 sudo[112118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:04:52 compute-1 python3.9[112120]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 09:05:04 compute-1 podman[112305]: 2025-12-05 09:05:04.67779384 +0000 UTC m=+0.106800835 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:05:07 compute-1 podman[112333]: 2025-12-05 09:05:07.631510838 +0000 UTC m=+0.068320141 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:05:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:05:08.845 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:05:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:05:08.848 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:05:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:05:08.848 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:05:09 compute-1 sshd-session[112358]: Received disconnect from 185.118.15.236 port 34304:11: Bye Bye [preauth]
Dec 05 09:05:09 compute-1 sshd-session[112358]: Disconnected from authenticating user root 185.118.15.236 port 34304 [preauth]
Dec 05 09:05:18 compute-1 kernel: SELinux:  Converting 2757 SID table entries...
Dec 05 09:05:18 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 09:05:18 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 05 09:05:18 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 09:05:18 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 05 09:05:18 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 09:05:18 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 09:05:18 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 09:05:28 compute-1 kernel: SELinux:  Converting 2757 SID table entries...
Dec 05 09:05:28 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 09:05:28 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 05 09:05:28 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 09:05:28 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 05 09:05:28 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 09:05:28 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 09:05:28 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 09:05:35 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 05 09:05:35 compute-1 podman[112376]: 2025-12-05 09:05:35.691294999 +0000 UTC m=+0.117380332 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Dec 05 09:05:38 compute-1 podman[112405]: 2025-12-05 09:05:38.619447251 +0000 UTC m=+0.055161379 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 05 09:05:39 compute-1 sshd-session[112403]: Received disconnect from 43.225.158.169 port 47432:11: Bye Bye [preauth]
Dec 05 09:05:39 compute-1 sshd-session[112403]: Disconnected from authenticating user root 43.225.158.169 port 47432 [preauth]
Dec 05 09:05:42 compute-1 sshd-session[112424]: Received disconnect from 122.168.194.41 port 45686:11: Bye Bye [preauth]
Dec 05 09:05:42 compute-1 sshd-session[112424]: Disconnected from authenticating user root 122.168.194.41 port 45686 [preauth]
Dec 05 09:05:42 compute-1 sshd-session[113304]: Connection closed by 123.56.157.254 port 57806
Dec 05 09:05:55 compute-1 sshd-session[120093]: Received disconnect from 122.114.113.177 port 34138:11: Bye Bye [preauth]
Dec 05 09:05:55 compute-1 sshd-session[120093]: Disconnected from authenticating user root 122.114.113.177 port 34138 [preauth]
Dec 05 09:06:03 compute-1 sshd[1008]: Timeout before authentication for connection from 101.47.162.91 to 38.102.83.154, pid = 104218
Dec 05 09:06:06 compute-1 podman[127989]: 2025-12-05 09:06:06.712135484 +0000 UTC m=+0.151713243 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 09:06:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:06:08.847 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:06:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:06:08.849 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:06:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:06:08.849 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:06:09 compute-1 podman[129246]: 2025-12-05 09:06:09.612174342 +0000 UTC m=+0.057241479 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 05 09:06:23 compute-1 sshd[1008]: drop connection #0 from [101.47.162.91]:46382 on [38.102.83.154]:22 penalty: exceeded LoginGraceTime
Dec 05 09:06:24 compute-1 kernel: SELinux:  Converting 2758 SID table entries...
Dec 05 09:06:24 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 09:06:24 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 05 09:06:24 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 09:06:24 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 05 09:06:24 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 09:06:24 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 09:06:24 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 09:06:26 compute-1 groupadd[129294]: group added to /etc/group: name=dnsmasq, GID=992
Dec 05 09:06:26 compute-1 groupadd[129294]: group added to /etc/gshadow: name=dnsmasq
Dec 05 09:06:26 compute-1 groupadd[129294]: new group: name=dnsmasq, GID=992
Dec 05 09:06:26 compute-1 useradd[129301]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 05 09:06:26 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 05 09:06:26 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 05 09:06:26 compute-1 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 05 09:06:27 compute-1 groupadd[129314]: group added to /etc/group: name=clevis, GID=991
Dec 05 09:06:27 compute-1 groupadd[129314]: group added to /etc/gshadow: name=clevis
Dec 05 09:06:27 compute-1 groupadd[129314]: new group: name=clevis, GID=991
Dec 05 09:06:27 compute-1 useradd[129321]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 05 09:06:27 compute-1 usermod[129331]: add 'clevis' to group 'tss'
Dec 05 09:06:27 compute-1 usermod[129331]: add 'clevis' to shadow group 'tss'
Dec 05 09:06:29 compute-1 polkitd[43753]: Reloading rules
Dec 05 09:06:29 compute-1 polkitd[43753]: Collecting garbage unconditionally...
Dec 05 09:06:29 compute-1 polkitd[43753]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 09:06:29 compute-1 polkitd[43753]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 09:06:29 compute-1 polkitd[43753]: Finished loading, compiling and executing 3 rules
Dec 05 09:06:29 compute-1 polkitd[43753]: Reloading rules
Dec 05 09:06:29 compute-1 polkitd[43753]: Collecting garbage unconditionally...
Dec 05 09:06:29 compute-1 polkitd[43753]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 09:06:29 compute-1 polkitd[43753]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 09:06:29 compute-1 polkitd[43753]: Finished loading, compiling and executing 3 rules
Dec 05 09:06:31 compute-1 groupadd[129518]: group added to /etc/group: name=ceph, GID=167
Dec 05 09:06:31 compute-1 groupadd[129518]: group added to /etc/gshadow: name=ceph
Dec 05 09:06:31 compute-1 groupadd[129518]: new group: name=ceph, GID=167
Dec 05 09:06:31 compute-1 useradd[129524]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 05 09:06:33 compute-1 sshd-session[129531]: Received disconnect from 185.118.15.236 port 34428:11: Bye Bye [preauth]
Dec 05 09:06:33 compute-1 sshd-session[129531]: Disconnected from authenticating user root 185.118.15.236 port 34428 [preauth]
Dec 05 09:06:34 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Dec 05 09:06:34 compute-1 sshd[1008]: Received signal 15; terminating.
Dec 05 09:06:34 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Dec 05 09:06:34 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Dec 05 09:06:34 compute-1 systemd[1]: sshd.service: Consumed 5.059s CPU time, read 32.0K from disk, written 112.0K to disk.
Dec 05 09:06:34 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Dec 05 09:06:34 compute-1 systemd[1]: Stopping sshd-keygen.target...
Dec 05 09:06:34 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 09:06:34 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 09:06:34 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 09:06:34 compute-1 systemd[1]: Reached target sshd-keygen.target.
Dec 05 09:06:34 compute-1 systemd[1]: Starting OpenSSH server daemon...
Dec 05 09:06:34 compute-1 sshd[130045]: Server listening on 0.0.0.0 port 22.
Dec 05 09:06:34 compute-1 sshd[130045]: Server listening on :: port 22.
Dec 05 09:06:34 compute-1 systemd[1]: Started OpenSSH server daemon.
Dec 05 09:06:36 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 09:06:36 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 05 09:06:36 compute-1 systemd[1]: Reloading.
Dec 05 09:06:36 compute-1 systemd-rc-local-generator[130302]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:36 compute-1 systemd-sysv-generator[130305]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:36 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 09:06:37 compute-1 podman[130357]: 2025-12-05 09:06:37.070631878 +0000 UTC m=+0.119740546 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 09:06:39 compute-1 sudo[112118]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:40 compute-1 podman[133456]: 2025-12-05 09:06:40.131028982 +0000 UTC m=+0.064605451 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 09:06:45 compute-1 sudo[138662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hywmbejgvmpvclxorqyhzlausvltjefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925604.8165286-969-93115853817148/AnsiballZ_systemd.py'
Dec 05 09:06:45 compute-1 sudo[138662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:45 compute-1 python3.9[138685]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 09:06:45 compute-1 systemd[1]: Reloading.
Dec 05 09:06:45 compute-1 systemd-rc-local-generator[138902]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:45 compute-1 systemd-sysv-generator[138910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:46 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 09:06:46 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 05 09:06:46 compute-1 systemd[1]: man-db-cache-update.service: Consumed 11.905s CPU time.
Dec 05 09:06:46 compute-1 systemd[1]: run-r2e9f169f161645b1885fa266cdf34f69.service: Deactivated successfully.
Dec 05 09:06:46 compute-1 sudo[138662]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:46 compute-1 sudo[139064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzewkwoyechkqeyqkzdbpvhovmejtyvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925606.3500068-969-151991416649266/AnsiballZ_systemd.py'
Dec 05 09:06:46 compute-1 sudo[139064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:46 compute-1 python3.9[139066]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 09:06:47 compute-1 systemd[1]: Reloading.
Dec 05 09:06:47 compute-1 systemd-rc-local-generator[139091]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:47 compute-1 systemd-sysv-generator[139096]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:47 compute-1 sudo[139064]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:47 compute-1 sudo[139253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reudfolkfmsnviaqlccxrggvzzojvtnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925607.458905-969-181307485721885/AnsiballZ_systemd.py'
Dec 05 09:06:47 compute-1 sudo[139253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:48 compute-1 python3.9[139255]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 09:06:48 compute-1 systemd[1]: Reloading.
Dec 05 09:06:48 compute-1 systemd-rc-local-generator[139285]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:48 compute-1 systemd-sysv-generator[139289]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:48 compute-1 sudo[139253]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:48 compute-1 sudo[139443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eggxyzwntvendrrvurdthzkoqpzjdiei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925608.5088952-969-164381117878251/AnsiballZ_systemd.py'
Dec 05 09:06:48 compute-1 sudo[139443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:49 compute-1 python3.9[139445]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 09:06:49 compute-1 systemd[1]: Reloading.
Dec 05 09:06:49 compute-1 systemd-rc-local-generator[139469]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:49 compute-1 systemd-sysv-generator[139473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:49 compute-1 sudo[139443]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:49 compute-1 sudo[139633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxhlsbpveiutckhxmrkuemidqxxcjetd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925609.6287122-1056-272510962291057/AnsiballZ_systemd.py'
Dec 05 09:06:49 compute-1 sudo[139633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:50 compute-1 python3.9[139635]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:06:50 compute-1 systemd[1]: Reloading.
Dec 05 09:06:50 compute-1 systemd-rc-local-generator[139664]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:50 compute-1 systemd-sysv-generator[139668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:50 compute-1 sudo[139633]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:51 compute-1 sudo[139825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtbjwlxpnnxubdaavmpbyhdmnyfdqxng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925610.7222197-1056-275700766996977/AnsiballZ_systemd.py'
Dec 05 09:06:51 compute-1 sudo[139825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:51 compute-1 python3.9[139827]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:06:51 compute-1 systemd[1]: Reloading.
Dec 05 09:06:51 compute-1 systemd-rc-local-generator[139857]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:51 compute-1 systemd-sysv-generator[139861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:51 compute-1 sudo[139825]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:51 compute-1 sshd-session[139674]: Received disconnect from 43.225.158.169 port 60574:11: Bye Bye [preauth]
Dec 05 09:06:51 compute-1 sshd-session[139674]: Disconnected from authenticating user root 43.225.158.169 port 60574 [preauth]
Dec 05 09:06:52 compute-1 sudo[140014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lboxcsliwftodijcnkmwtdbhfkfabmsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925611.8228562-1056-138200531909654/AnsiballZ_systemd.py'
Dec 05 09:06:52 compute-1 sudo[140014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:52 compute-1 python3.9[140016]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:06:52 compute-1 systemd[1]: Reloading.
Dec 05 09:06:52 compute-1 systemd-rc-local-generator[140045]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:52 compute-1 systemd-sysv-generator[140049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:52 compute-1 sudo[140014]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:53 compute-1 sudo[140204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzbhxoxnvlnymxredxiyncmvfjvfkulk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925612.8963733-1056-215942848008916/AnsiballZ_systemd.py'
Dec 05 09:06:53 compute-1 sudo[140204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:53 compute-1 python3.9[140206]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:06:53 compute-1 sudo[140204]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:54 compute-1 sudo[140359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svkrgspajczdxgouimdhooaowstbrfdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925613.7330928-1056-200726500544211/AnsiballZ_systemd.py'
Dec 05 09:06:54 compute-1 sudo[140359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:54 compute-1 python3.9[140361]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:06:54 compute-1 systemd[1]: Reloading.
Dec 05 09:06:54 compute-1 systemd-rc-local-generator[140391]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:54 compute-1 systemd-sysv-generator[140394]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:54 compute-1 sudo[140359]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:55 compute-1 sudo[140548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhatdyyspqjunvpaguliopdqqaicbniy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925615.0009696-1164-157129729176083/AnsiballZ_systemd.py'
Dec 05 09:06:55 compute-1 sudo[140548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:55 compute-1 python3.9[140550]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 09:06:56 compute-1 systemd[1]: Reloading.
Dec 05 09:06:56 compute-1 systemd-rc-local-generator[140580]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:06:56 compute-1 systemd-sysv-generator[140583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:06:56 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 05 09:06:56 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 05 09:06:57 compute-1 sudo[140548]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:57 compute-1 sudo[140741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svrnzvglpidjdnrlzousebrnrmptvamt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925617.2071803-1188-116917117815149/AnsiballZ_systemd.py'
Dec 05 09:06:57 compute-1 sudo[140741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:57 compute-1 python3.9[140743]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:06:57 compute-1 sudo[140741]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:58 compute-1 sudo[140896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbkozmwarletvhohxkhuglpyndsznobj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925618.0587368-1188-144354297247080/AnsiballZ_systemd.py'
Dec 05 09:06:58 compute-1 sudo[140896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:58 compute-1 python3.9[140898]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:06:58 compute-1 sudo[140896]: pam_unix(sudo:session): session closed for user root
Dec 05 09:06:59 compute-1 sudo[141053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cacgexxojdrvhcdavjvebdxkjufkngzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925619.0862215-1188-73076177846996/AnsiballZ_systemd.py'
Dec 05 09:06:59 compute-1 sudo[141053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:06:59 compute-1 python3.9[141055]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:06:59 compute-1 sudo[141053]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:00 compute-1 sudo[141208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptrugkdnrcbhobihxowhalbzdprmuikx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925620.0340664-1188-223795602062205/AnsiballZ_systemd.py'
Dec 05 09:07:00 compute-1 sudo[141208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:00 compute-1 python3.9[141210]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:00 compute-1 sudo[141208]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:00 compute-1 sshd-session[141001]: Received disconnect from 122.168.194.41 port 47964:11: Bye Bye [preauth]
Dec 05 09:07:00 compute-1 sshd-session[141001]: Disconnected from authenticating user root 122.168.194.41 port 47964 [preauth]
Dec 05 09:07:01 compute-1 sudo[141363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmkekjcmfxffayhuzqgcxqqusjfomigc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925620.8366632-1188-251411409386230/AnsiballZ_systemd.py'
Dec 05 09:07:01 compute-1 sudo[141363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:01 compute-1 python3.9[141365]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:01 compute-1 sudo[141363]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:01 compute-1 sudo[141518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuibejbbkmduiwykkuuvnlgstfkyunek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925621.6855073-1188-30514184231718/AnsiballZ_systemd.py'
Dec 05 09:07:01 compute-1 sudo[141518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:02 compute-1 python3.9[141520]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:02 compute-1 sudo[141518]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:02 compute-1 sudo[141673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bagbbcynbnuwwezqvgzxmukbnwhgzwmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925622.5294425-1188-245863761030756/AnsiballZ_systemd.py'
Dec 05 09:07:02 compute-1 sudo[141673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:03 compute-1 python3.9[141675]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:03 compute-1 sudo[141673]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:03 compute-1 sudo[141828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfqrtdbvcqdplmndjvvshwwmdopvzioq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925623.3719983-1188-218986296621176/AnsiballZ_systemd.py'
Dec 05 09:07:03 compute-1 sudo[141828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:03 compute-1 python3.9[141830]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:04 compute-1 sudo[141828]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:04 compute-1 sudo[141983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdqmxalyzstlojavbzjhfxvqlhlefrws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925624.2122965-1188-18279135590637/AnsiballZ_systemd.py'
Dec 05 09:07:04 compute-1 sudo[141983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:04 compute-1 python3.9[141985]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:04 compute-1 sudo[141983]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:05 compute-1 sudo[142138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imotscpffkpbffblrihfamcaexaduvcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925625.1105723-1188-224535553555648/AnsiballZ_systemd.py'
Dec 05 09:07:05 compute-1 sudo[142138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:05 compute-1 python3.9[142140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:05 compute-1 sudo[142138]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:06 compute-1 sudo[142293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egyudaalluatbqcploblcwharjbgqjgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925625.9731567-1188-122129040540943/AnsiballZ_systemd.py'
Dec 05 09:07:06 compute-1 sudo[142293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:06 compute-1 python3.9[142295]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:06 compute-1 sudo[142293]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:07 compute-1 sudo[142448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqcrktwouorbklezdnipujgrnrhyezom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925626.839401-1188-160322588745020/AnsiballZ_systemd.py'
Dec 05 09:07:07 compute-1 sudo[142448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:07 compute-1 podman[142450]: 2025-12-05 09:07:07.260507847 +0000 UTC m=+0.107726189 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller)
Dec 05 09:07:07 compute-1 python3.9[142451]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:07 compute-1 sudo[142448]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:08 compute-1 sudo[142627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zznhmfizffnqoobnunmusjedoycbnscm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925627.7021334-1188-34627449554163/AnsiballZ_systemd.py'
Dec 05 09:07:08 compute-1 sudo[142627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:08 compute-1 python3.9[142629]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:08 compute-1 sudo[142627]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:07:08.849 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:07:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:07:08.852 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:07:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:07:08.852 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:07:09 compute-1 sudo[142782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epfxghkpfafrqvmwhtnjjysrtsnxsqaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925628.7237015-1188-139238220034190/AnsiballZ_systemd.py'
Dec 05 09:07:09 compute-1 sudo[142782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:09 compute-1 python3.9[142784]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 09:07:09 compute-1 sudo[142782]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:10 compute-1 sudo[142946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bohnbilqjguvxvnbrupllhtoompdholr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925630.0431042-1494-215769618100588/AnsiballZ_file.py'
Dec 05 09:07:10 compute-1 sudo[142946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:10 compute-1 podman[142911]: 2025-12-05 09:07:10.376638464 +0000 UTC m=+0.064433653 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 09:07:10 compute-1 python3.9[142958]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:07:10 compute-1 sudo[142946]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:11 compute-1 sudo[143108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnbykvrkspayqbmuqyjuotunviexhiuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925630.7827997-1494-178595560655243/AnsiballZ_file.py'
Dec 05 09:07:11 compute-1 sudo[143108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:11 compute-1 python3.9[143110]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:07:11 compute-1 sudo[143108]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:11 compute-1 sudo[143260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yakmkkilpgyifynqqfivoeckapmlqeqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925631.4200919-1494-276799112445737/AnsiballZ_file.py'
Dec 05 09:07:11 compute-1 sudo[143260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:11 compute-1 python3.9[143262]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:07:11 compute-1 sudo[143260]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:12 compute-1 sudo[143412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fapfuvdwhunhkxqkszhmsteqyidaazzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925632.1150737-1494-192678415544461/AnsiballZ_file.py'
Dec 05 09:07:12 compute-1 sudo[143412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:12 compute-1 python3.9[143414]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:07:12 compute-1 sudo[143412]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:13 compute-1 sudo[143564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gerrdxmxaugbbczvfvgjwcijpdocwpgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925632.7922876-1494-275310390656928/AnsiballZ_file.py'
Dec 05 09:07:13 compute-1 sudo[143564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:13 compute-1 python3.9[143566]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:07:13 compute-1 sudo[143564]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:14 compute-1 sudo[143716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lntyklmajzhvvpvouczmqnnpnibnbdeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925633.871858-1494-189451119362194/AnsiballZ_file.py'
Dec 05 09:07:14 compute-1 sudo[143716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:14 compute-1 python3.9[143718]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:07:14 compute-1 sudo[143716]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:15 compute-1 sudo[143868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afbxxoirjssmizzlxrrseoqbkdxrktkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925634.5754645-1623-279078661924579/AnsiballZ_stat.py'
Dec 05 09:07:15 compute-1 sudo[143868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:15 compute-1 python3.9[143870]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:15 compute-1 sudo[143868]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:15 compute-1 sudo[143993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzunkpixajtwsaoaqwwrfikwxpwdhrhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925634.5754645-1623-279078661924579/AnsiballZ_copy.py'
Dec 05 09:07:15 compute-1 sudo[143993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:16 compute-1 python3.9[143995]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764925634.5754645-1623-279078661924579/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:16 compute-1 sudo[143993]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:16 compute-1 sudo[144145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hydftyxrawxwndfxiqwrnmapwgahprwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925636.2815585-1623-131326315170716/AnsiballZ_stat.py'
Dec 05 09:07:16 compute-1 sudo[144145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:16 compute-1 python3.9[144147]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:16 compute-1 sudo[144145]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:17 compute-1 sudo[144270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udthjpcgpqapgwdrrtdpxykbdkbjclnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925636.2815585-1623-131326315170716/AnsiballZ_copy.py'
Dec 05 09:07:17 compute-1 sudo[144270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:17 compute-1 python3.9[144272]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764925636.2815585-1623-131326315170716/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:17 compute-1 sudo[144270]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:17 compute-1 sudo[144422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjldcpwmpkjavimhdliuojldwfkjvfgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925637.5454922-1623-68833817851906/AnsiballZ_stat.py'
Dec 05 09:07:17 compute-1 sudo[144422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:18 compute-1 python3.9[144424]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:18 compute-1 sudo[144422]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:18 compute-1 sudo[144547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixwmnvltozhdgitjgnbnfgphgzqnhlce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925637.5454922-1623-68833817851906/AnsiballZ_copy.py'
Dec 05 09:07:18 compute-1 sudo[144547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:18 compute-1 python3.9[144549]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764925637.5454922-1623-68833817851906/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:18 compute-1 sudo[144547]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:19 compute-1 sudo[144699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhnnrgvryloiipewsxjhyqlerxujfrzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925638.9027805-1623-245678491637485/AnsiballZ_stat.py'
Dec 05 09:07:19 compute-1 sudo[144699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:19 compute-1 python3.9[144701]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:19 compute-1 sudo[144699]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:19 compute-1 sudo[144824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-volkmfbchszaoratllvkqrynufurunco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925638.9027805-1623-245678491637485/AnsiballZ_copy.py'
Dec 05 09:07:19 compute-1 sudo[144824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:19 compute-1 python3.9[144826]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764925638.9027805-1623-245678491637485/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:20 compute-1 sudo[144824]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:20 compute-1 sudo[144976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drxlpwnegirqkktmiouzbiolfhraxisl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925640.1639152-1623-92048025625934/AnsiballZ_stat.py'
Dec 05 09:07:20 compute-1 sudo[144976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:20 compute-1 python3.9[144978]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:20 compute-1 sudo[144976]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:21 compute-1 sudo[145101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwcsjjgwjegbrvbifgefxfsqdskeyhpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925640.1639152-1623-92048025625934/AnsiballZ_copy.py'
Dec 05 09:07:21 compute-1 sudo[145101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:21 compute-1 python3.9[145103]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764925640.1639152-1623-92048025625934/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:21 compute-1 sudo[145101]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:21 compute-1 sudo[145253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaflovdvmayrrbazfhobnduovmzdkubp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925641.4660058-1623-91199091294881/AnsiballZ_stat.py'
Dec 05 09:07:21 compute-1 sudo[145253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:21 compute-1 python3.9[145255]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:22 compute-1 sudo[145253]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:22 compute-1 sudo[145378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqsevnhzhmcrhavhaasdvnmlrnrxodbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925641.4660058-1623-91199091294881/AnsiballZ_copy.py'
Dec 05 09:07:22 compute-1 sudo[145378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:22 compute-1 python3.9[145380]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764925641.4660058-1623-91199091294881/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:22 compute-1 sudo[145378]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:23 compute-1 sudo[145530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-papsjjyypvwayrmdfpkkqaaweihfncbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925643.078358-1623-252676516098551/AnsiballZ_stat.py'
Dec 05 09:07:23 compute-1 sudo[145530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:23 compute-1 python3.9[145532]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:23 compute-1 sudo[145530]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:24 compute-1 sudo[145653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnrozomtivfqmgjcbcesnbzgtkjabvxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925643.078358-1623-252676516098551/AnsiballZ_copy.py'
Dec 05 09:07:24 compute-1 sudo[145653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:24 compute-1 python3.9[145655]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764925643.078358-1623-252676516098551/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:24 compute-1 sudo[145653]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:24 compute-1 sudo[145805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuvqmykagfnndiwaeuesubnqhcwgsgnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925644.6569343-1623-226867991905631/AnsiballZ_stat.py'
Dec 05 09:07:24 compute-1 sudo[145805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:25 compute-1 python3.9[145807]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:25 compute-1 sudo[145805]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:25 compute-1 sudo[145930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaeplgabsowdiniothvnnivuqqvowxfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925644.6569343-1623-226867991905631/AnsiballZ_copy.py'
Dec 05 09:07:25 compute-1 sudo[145930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:25 compute-1 python3.9[145932]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764925644.6569343-1623-226867991905631/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:25 compute-1 sudo[145930]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:26 compute-1 sudo[146082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yogopxczopvyrnjbireeygrdriecfhla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925646.0412443-1962-130918402808422/AnsiballZ_command.py'
Dec 05 09:07:26 compute-1 sudo[146082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:26 compute-1 python3.9[146084]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 05 09:07:26 compute-1 sudo[146082]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:27 compute-1 sudo[146235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywjfixpnmenoffeeciubrikmkbtwtaez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925646.8341188-1989-21582480232444/AnsiballZ_file.py'
Dec 05 09:07:27 compute-1 sudo[146235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:27 compute-1 python3.9[146237]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:27 compute-1 sudo[146235]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:28 compute-1 sudo[146387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aufdovanopkjvxvoocztbwkibmlhrnal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925647.5399964-1989-49331530570501/AnsiballZ_file.py'
Dec 05 09:07:28 compute-1 sudo[146387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:28 compute-1 python3.9[146389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:28 compute-1 sudo[146387]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:28 compute-1 sudo[146539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcdheuyvhumifyklxixzmlyyygvhgtol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925648.4168768-1989-37130851756024/AnsiballZ_file.py'
Dec 05 09:07:28 compute-1 sudo[146539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:28 compute-1 python3.9[146541]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:28 compute-1 sudo[146539]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:29 compute-1 sudo[146691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyclqpfcbikvrhrpedfzofsbkzfytzld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925649.0939832-1989-237101094521369/AnsiballZ_file.py'
Dec 05 09:07:29 compute-1 sudo[146691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:29 compute-1 python3.9[146693]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:29 compute-1 sudo[146691]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:30 compute-1 sudo[146843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmrspyqcgaeqwrfmovxoggddjncnkktd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925649.9073668-1989-4170364764138/AnsiballZ_file.py'
Dec 05 09:07:30 compute-1 sudo[146843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:30 compute-1 python3.9[146845]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:30 compute-1 sudo[146843]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:30 compute-1 sudo[146995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwmywpvbxsjeckbxefcyxtzkscxsxxme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925650.6056733-1989-252338182178784/AnsiballZ_file.py'
Dec 05 09:07:30 compute-1 sudo[146995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:31 compute-1 python3.9[146997]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:31 compute-1 sudo[146995]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:31 compute-1 sudo[147147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgzpxbaewnffyilulegzacdxkotydzjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925651.3458169-1989-217089745723122/AnsiballZ_file.py'
Dec 05 09:07:31 compute-1 sudo[147147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:31 compute-1 python3.9[147149]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:31 compute-1 sudo[147147]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:32 compute-1 sudo[147299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awdtzufazswhfsekaayygxxsqjolvylk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925652.0372696-1989-240892821018189/AnsiballZ_file.py'
Dec 05 09:07:32 compute-1 sudo[147299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:32 compute-1 python3.9[147301]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:32 compute-1 sudo[147299]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:33 compute-1 sudo[147451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-intqwzmrfwusnoulcqypfexzrcfdtjnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925653.0733714-1989-102459506291129/AnsiballZ_file.py'
Dec 05 09:07:33 compute-1 sudo[147451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:34 compute-1 python3.9[147453]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:34 compute-1 sudo[147451]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:34 compute-1 sudo[147603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faambacldkanzfdhludkxcoxcbnspssr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925654.6251135-1989-122254677083680/AnsiballZ_file.py'
Dec 05 09:07:34 compute-1 sudo[147603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:35 compute-1 python3.9[147605]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:35 compute-1 sudo[147603]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:35 compute-1 sudo[147755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhykyqwfafxwvyujpcyemntcayerglrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925655.2772372-1989-203569552196275/AnsiballZ_file.py'
Dec 05 09:07:35 compute-1 sudo[147755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:35 compute-1 python3.9[147757]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:35 compute-1 sudo[147755]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:36 compute-1 sudo[147907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jabkcmscfyrvwsfqlmlrntmrnfgupltu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925655.9708369-1989-62801242530758/AnsiballZ_file.py'
Dec 05 09:07:36 compute-1 sudo[147907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:36 compute-1 python3.9[147909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:36 compute-1 sudo[147907]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:36 compute-1 sudo[148059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uueuxpuxadzqsxclgzgvpvdbhbnpelgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925656.636035-1989-14988500263980/AnsiballZ_file.py'
Dec 05 09:07:36 compute-1 sudo[148059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:37 compute-1 python3.9[148061]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:37 compute-1 sudo[148059]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:37 compute-1 podman[148107]: 2025-12-05 09:07:37.670667581 +0000 UTC m=+0.094663715 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 05 09:07:37 compute-1 sudo[148239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tujvrqkwpfgdiatlqfeflhpovbdvzloi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925657.5482385-1989-20556560507300/AnsiballZ_file.py'
Dec 05 09:07:37 compute-1 sudo[148239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:38 compute-1 python3.9[148241]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:38 compute-1 sudo[148239]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:38 compute-1 sudo[148391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybzwekwllpdjubtdhdfeicaldgpmgkmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925658.276984-2286-84258661054786/AnsiballZ_stat.py'
Dec 05 09:07:38 compute-1 sudo[148391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:38 compute-1 python3.9[148393]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:38 compute-1 sudo[148391]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:39 compute-1 sudo[148514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqwbjdrtfiwxtnytgafxdbrdarnphgbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925658.276984-2286-84258661054786/AnsiballZ_copy.py'
Dec 05 09:07:39 compute-1 sudo[148514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:39 compute-1 python3.9[148516]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925658.276984-2286-84258661054786/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:39 compute-1 sudo[148514]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:39 compute-1 sudo[148666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obwaoqhtbmqmskevkekujgfdghwhzuzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925659.5821412-2286-101871987244046/AnsiballZ_stat.py'
Dec 05 09:07:39 compute-1 sudo[148666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:40 compute-1 python3.9[148668]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:40 compute-1 sudo[148666]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:40 compute-1 sudo[148801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yztpfkqasexvpqbozslnnhzrdneqwhhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925659.5821412-2286-101871987244046/AnsiballZ_copy.py'
Dec 05 09:07:40 compute-1 sudo[148801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:40 compute-1 podman[148763]: 2025-12-05 09:07:40.593755038 +0000 UTC m=+0.075696093 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 09:07:40 compute-1 python3.9[148807]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925659.5821412-2286-101871987244046/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:40 compute-1 sudo[148801]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:41 compute-1 sudo[148961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmddkllbwqikidytrsuyuszcaxqaiwdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925660.958997-2286-234901573637572/AnsiballZ_stat.py'
Dec 05 09:07:41 compute-1 sudo[148961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:41 compute-1 python3.9[148963]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:41 compute-1 sudo[148961]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:41 compute-1 sudo[149084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czbnnoypdhkvnkbpkygtncfeammzhhal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925660.958997-2286-234901573637572/AnsiballZ_copy.py'
Dec 05 09:07:41 compute-1 sudo[149084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:42 compute-1 python3.9[149086]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925660.958997-2286-234901573637572/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:42 compute-1 sudo[149084]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:42 compute-1 sudo[149236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aapmtlmmgkywyxwrevtbqhcejczwugys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925662.2767065-2286-68316150395572/AnsiballZ_stat.py'
Dec 05 09:07:42 compute-1 sudo[149236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:42 compute-1 python3.9[149238]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:42 compute-1 sudo[149236]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:43 compute-1 sudo[149359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdahiwizuscbguepiynmkhgfyattqkuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925662.2767065-2286-68316150395572/AnsiballZ_copy.py'
Dec 05 09:07:43 compute-1 sudo[149359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:43 compute-1 python3.9[149361]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925662.2767065-2286-68316150395572/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:43 compute-1 sudo[149359]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:43 compute-1 sudo[149511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgedvvoblnpyadusizdjdtahzjhzsrdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925663.5372772-2286-224350253617260/AnsiballZ_stat.py'
Dec 05 09:07:43 compute-1 sudo[149511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:44 compute-1 python3.9[149513]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:44 compute-1 sudo[149511]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:44 compute-1 sudo[149634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcdwagsyltwatoyivpohhszvmkzdqaug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925663.5372772-2286-224350253617260/AnsiballZ_copy.py'
Dec 05 09:07:44 compute-1 sudo[149634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:44 compute-1 python3.9[149636]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925663.5372772-2286-224350253617260/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:44 compute-1 sudo[149634]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:45 compute-1 sudo[149786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osclxelivuwlukdkxmmnjdkxorilpiuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925664.7493062-2286-124559033234368/AnsiballZ_stat.py'
Dec 05 09:07:45 compute-1 sudo[149786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:45 compute-1 python3.9[149788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:45 compute-1 sudo[149786]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:45 compute-1 sudo[149909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keuokkawggdfxuveziuccrcdvpetpqnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925664.7493062-2286-124559033234368/AnsiballZ_copy.py'
Dec 05 09:07:45 compute-1 sudo[149909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:45 compute-1 python3.9[149911]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925664.7493062-2286-124559033234368/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:45 compute-1 sudo[149909]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:46 compute-1 sudo[150061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfwbwwjqbmuyorvfhhlolmwvthnhogvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925666.0092144-2286-205316009196682/AnsiballZ_stat.py'
Dec 05 09:07:46 compute-1 sudo[150061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:46 compute-1 python3.9[150063]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:46 compute-1 sudo[150061]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:46 compute-1 sudo[150184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guynsxyenwuunbgbhwhcxaloarjhrtpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925666.0092144-2286-205316009196682/AnsiballZ_copy.py'
Dec 05 09:07:46 compute-1 sudo[150184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:47 compute-1 python3.9[150186]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925666.0092144-2286-205316009196682/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:47 compute-1 sudo[150184]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:47 compute-1 sudo[150336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ersmbxvkhlcjimpxqazendraojhigjsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925667.3254223-2286-228621259214674/AnsiballZ_stat.py'
Dec 05 09:07:47 compute-1 sudo[150336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:47 compute-1 python3.9[150338]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:47 compute-1 sudo[150336]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:48 compute-1 sudo[150459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odaforasqqdgqtrxckxwgskdrbmreijr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925667.3254223-2286-228621259214674/AnsiballZ_copy.py'
Dec 05 09:07:48 compute-1 sudo[150459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:48 compute-1 python3.9[150461]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925667.3254223-2286-228621259214674/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:48 compute-1 sudo[150459]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:48 compute-1 sudo[150611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzxbxubdkmarvwnxfgdlicbatyymanpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925668.590859-2286-43711718971668/AnsiballZ_stat.py'
Dec 05 09:07:48 compute-1 sudo[150611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:49 compute-1 python3.9[150613]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:49 compute-1 sudo[150611]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:49 compute-1 sudo[150734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqefkhvrkyhjtgremsyybzgcdjospkof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925668.590859-2286-43711718971668/AnsiballZ_copy.py'
Dec 05 09:07:49 compute-1 sudo[150734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:49 compute-1 python3.9[150736]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925668.590859-2286-43711718971668/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:49 compute-1 sudo[150734]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:50 compute-1 sudo[150886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyhfvxmwsodyzkmwsfoullzjwwbjqmsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925669.8279748-2286-235322487319616/AnsiballZ_stat.py'
Dec 05 09:07:50 compute-1 sudo[150886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:50 compute-1 python3.9[150888]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:50 compute-1 sudo[150886]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:50 compute-1 sudo[151009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xziddfhnptrvrujrczkonmwzepoqbnvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925669.8279748-2286-235322487319616/AnsiballZ_copy.py'
Dec 05 09:07:50 compute-1 sudo[151009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:50 compute-1 python3.9[151011]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925669.8279748-2286-235322487319616/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:50 compute-1 sudo[151009]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:51 compute-1 sudo[151161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzdrqqkvpofdkpyjoemefapdrimfvlak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925671.0743303-2286-158382552788750/AnsiballZ_stat.py'
Dec 05 09:07:51 compute-1 sudo[151161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:51 compute-1 python3.9[151163]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:51 compute-1 sudo[151161]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:51 compute-1 sudo[151284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jywgqyllycevnbfgcjvpmyksnnucqain ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925671.0743303-2286-158382552788750/AnsiballZ_copy.py'
Dec 05 09:07:51 compute-1 sudo[151284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:52 compute-1 python3.9[151286]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925671.0743303-2286-158382552788750/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:52 compute-1 sudo[151284]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:52 compute-1 sudo[151436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwqnoofjltahyafgthbjqzdzehagwtxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925672.3227265-2286-48942465700574/AnsiballZ_stat.py'
Dec 05 09:07:52 compute-1 sudo[151436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:52 compute-1 python3.9[151438]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:52 compute-1 sudo[151436]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:53 compute-1 sudo[151559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjrilfleormewijwhnpvftfsmnctmhpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925672.3227265-2286-48942465700574/AnsiballZ_copy.py'
Dec 05 09:07:53 compute-1 sudo[151559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:53 compute-1 python3.9[151561]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925672.3227265-2286-48942465700574/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:53 compute-1 sudo[151559]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:53 compute-1 sudo[151711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjpzkwwkywbhhopaxndgghciysdkhldy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925673.6473951-2286-30881078427356/AnsiballZ_stat.py'
Dec 05 09:07:53 compute-1 sudo[151711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:54 compute-1 python3.9[151713]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:54 compute-1 sudo[151711]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:54 compute-1 sudo[151834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjobcnwyyrncwpaxrqczvpjpcbzbfere ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925673.6473951-2286-30881078427356/AnsiballZ_copy.py'
Dec 05 09:07:54 compute-1 sudo[151834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:54 compute-1 python3.9[151836]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925673.6473951-2286-30881078427356/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:54 compute-1 sudo[151834]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:55 compute-1 sudo[151986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwziwonpogbsydwngcyvfvoexnryzhne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925675.123311-2286-118503808586401/AnsiballZ_stat.py'
Dec 05 09:07:55 compute-1 sudo[151986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:55 compute-1 python3.9[151988]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:07:55 compute-1 sudo[151986]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:56 compute-1 sudo[152109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywwahbqcymcjhdofqiycvucxsyiqjgii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925675.123311-2286-118503808586401/AnsiballZ_copy.py'
Dec 05 09:07:56 compute-1 sudo[152109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:56 compute-1 python3.9[152111]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925675.123311-2286-118503808586401/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:07:56 compute-1 sudo[152109]: pam_unix(sudo:session): session closed for user root
Dec 05 09:07:57 compute-1 python3.9[152263]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:07:58 compute-1 sshd-session[152211]: Received disconnect from 185.118.15.236 port 34556:11: Bye Bye [preauth]
Dec 05 09:07:58 compute-1 sshd-session[152211]: Disconnected from authenticating user root 185.118.15.236 port 34556 [preauth]
Dec 05 09:07:58 compute-1 sudo[152416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdbcfpptnagpokdpwdqsoznfxwpjgxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925677.6669965-2904-250168487220072/AnsiballZ_seboolean.py'
Dec 05 09:07:58 compute-1 sudo[152416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:07:58 compute-1 python3.9[152418]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 05 09:07:59 compute-1 sudo[152416]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:00 compute-1 sudo[152572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tguuwoqqpphijudinfarjatmvjfhkpcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925680.1028712-2928-204357813109006/AnsiballZ_copy.py'
Dec 05 09:08:00 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 05 09:08:00 compute-1 sudo[152572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:00 compute-1 python3.9[152574]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:00 compute-1 sudo[152572]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:01 compute-1 sudo[152724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chnxqaobloxgjjacjyuuaoowgosryqil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925680.8005133-2928-149757351944064/AnsiballZ_copy.py'
Dec 05 09:08:01 compute-1 sudo[152724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:01 compute-1 python3.9[152726]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:01 compute-1 sudo[152724]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:01 compute-1 sudo[152876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glvfbgxhbivdwraazdbkbaeecdkazynl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925681.452412-2928-56440900172329/AnsiballZ_copy.py'
Dec 05 09:08:01 compute-1 sudo[152876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:01 compute-1 python3.9[152878]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:01 compute-1 sudo[152876]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:02 compute-1 sudo[153029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoizvpuaikfyjaiqxlvcppqgpfcnawlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925682.1294303-2928-75215654522042/AnsiballZ_copy.py'
Dec 05 09:08:02 compute-1 sudo[153029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:02 compute-1 python3.9[153031]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:02 compute-1 sudo[153029]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:03 compute-1 sudo[153183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozqktjdqwwcaqhfdyqkomnkasbbzfdmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925682.7564027-2928-207461937159184/AnsiballZ_copy.py'
Dec 05 09:08:03 compute-1 sudo[153183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:03 compute-1 python3.9[153185]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:03 compute-1 sudo[153183]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:04 compute-1 sudo[153335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjcnpnlulshzplkidlefbislacdqcwvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925683.6722581-3036-67890526180868/AnsiballZ_copy.py'
Dec 05 09:08:04 compute-1 sudo[153335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:04 compute-1 python3.9[153337]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:04 compute-1 sudo[153335]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:04 compute-1 sshd-session[153131]: Received disconnect from 43.225.158.169 port 45482:11: Bye Bye [preauth]
Dec 05 09:08:04 compute-1 sshd-session[153131]: Disconnected from authenticating user root 43.225.158.169 port 45482 [preauth]
Dec 05 09:08:04 compute-1 sudo[153487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlhmmubdyuuhunjfgpbdhhyqzylmypyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925684.3984447-3036-61134725109111/AnsiballZ_copy.py'
Dec 05 09:08:04 compute-1 sudo[153487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:04 compute-1 python3.9[153489]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:04 compute-1 sudo[153487]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:05 compute-1 sudo[153639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfqdcizyauwtgijizlbpkxkkalcretml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925685.064099-3036-248877900163661/AnsiballZ_copy.py'
Dec 05 09:08:05 compute-1 sudo[153639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:05 compute-1 python3.9[153641]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:05 compute-1 sudo[153639]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:05 compute-1 sudo[153791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqnfdybbyslqqnwrgpzxdqfkaxpxpzbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925685.7016885-3036-142795489440271/AnsiballZ_copy.py'
Dec 05 09:08:05 compute-1 sudo[153791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:06 compute-1 python3.9[153793]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:06 compute-1 sudo[153791]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:06 compute-1 sudo[153943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oscijeupmmulqcrfhbludquraebisgmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925686.3537862-3036-24302760862921/AnsiballZ_copy.py'
Dec 05 09:08:06 compute-1 sudo[153943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:06 compute-1 python3.9[153945]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:06 compute-1 sudo[153943]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:07 compute-1 sudo[154095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swzuemwdrbbwfcarqbbkqyamdkhojkcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925687.0801656-3144-242731430432054/AnsiballZ_systemd.py'
Dec 05 09:08:07 compute-1 sudo[154095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:07 compute-1 python3.9[154097]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:08:07 compute-1 systemd[1]: Reloading.
Dec 05 09:08:07 compute-1 systemd-rc-local-generator[154146]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:08:07 compute-1 systemd-sysv-generator[154149]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:08:07 compute-1 podman[154099]: 2025-12-05 09:08:07.861116118 +0000 UTC m=+0.109065092 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 09:08:08 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Dec 05 09:08:08 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Dec 05 09:08:08 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 05 09:08:08 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 05 09:08:08 compute-1 systemd[1]: Starting libvirt logging daemon...
Dec 05 09:08:08 compute-1 systemd[1]: Started libvirt logging daemon.
Dec 05 09:08:08 compute-1 sudo[154095]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:08:08.850 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:08:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:08:08.855 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:08:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:08:08.855 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:08:08 compute-1 sudo[154317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiwrozvtesiyzspmytvsexwjtldnurqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925688.472099-3144-198604238852788/AnsiballZ_systemd.py'
Dec 05 09:08:08 compute-1 sudo[154317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:09 compute-1 python3.9[154319]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:08:09 compute-1 systemd[1]: Reloading.
Dec 05 09:08:09 compute-1 systemd-rc-local-generator[154345]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:08:09 compute-1 systemd-sysv-generator[154350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:08:09 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 05 09:08:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 05 09:08:09 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 05 09:08:09 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 05 09:08:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 05 09:08:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 05 09:08:09 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Dec 05 09:08:09 compute-1 systemd[1]: Started libvirt nodedev daemon.
Dec 05 09:08:09 compute-1 sudo[154317]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:10 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 05 09:08:10 compute-1 sudo[154534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qskjglykxmdikyuoevnotobbwzsvazlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925689.8040345-3144-265757075884194/AnsiballZ_systemd.py'
Dec 05 09:08:10 compute-1 sudo[154534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:10 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 05 09:08:10 compute-1 python3.9[154536]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:08:10 compute-1 systemd[1]: Reloading.
Dec 05 09:08:10 compute-1 systemd-rc-local-generator[154565]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:08:10 compute-1 systemd-sysv-generator[154568]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:08:10 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 05 09:08:10 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 05 09:08:10 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 05 09:08:10 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 05 09:08:10 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 05 09:08:10 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 05 09:08:10 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 05 09:08:10 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 05 09:08:10 compute-1 sudo[154534]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:10 compute-1 podman[154574]: 2025-12-05 09:08:10.838626254 +0000 UTC m=+0.109994370 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 05 09:08:11 compute-1 sudo[154771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dptzqyahxxrzugwzbbtpajnotysneqos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925691.0039806-3144-158601875032381/AnsiballZ_systemd.py'
Dec 05 09:08:11 compute-1 sudo[154771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:11 compute-1 python3.9[154773]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:08:11 compute-1 systemd[1]: Reloading.
Dec 05 09:08:11 compute-1 systemd-sysv-generator[154802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:08:11 compute-1 systemd-rc-local-generator[154799]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:08:11 compute-1 setroubleshoot[154507]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 1c3a5223-08fc-45aa-9d81-b8311a98de9c
Dec 05 09:08:11 compute-1 setroubleshoot[154507]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 05 09:08:11 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:08:12 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Dec 05 09:08:12 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 05 09:08:12 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 05 09:08:12 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 05 09:08:12 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 05 09:08:12 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 05 09:08:12 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 05 09:08:12 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 05 09:08:12 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 05 09:08:12 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 05 09:08:12 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Dec 05 09:08:12 compute-1 systemd[1]: Started libvirt QEMU daemon.
Dec 05 09:08:12 compute-1 sudo[154771]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:12 compute-1 sudo[154988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfwsbjmktzpgfgyennhyqcinozjhizqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925692.3332565-3144-163804651743757/AnsiballZ_systemd.py'
Dec 05 09:08:12 compute-1 sudo[154988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:13 compute-1 python3.9[154990]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:08:13 compute-1 systemd[1]: Reloading.
Dec 05 09:08:13 compute-1 systemd-rc-local-generator[155018]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:08:13 compute-1 systemd-sysv-generator[155021]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:08:13 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Dec 05 09:08:13 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Dec 05 09:08:13 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 05 09:08:13 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 05 09:08:13 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 05 09:08:13 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 05 09:08:13 compute-1 systemd[1]: Starting libvirt secret daemon...
Dec 05 09:08:13 compute-1 systemd[1]: Started libvirt secret daemon.
Dec 05 09:08:13 compute-1 sudo[154988]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:14 compute-1 sudo[155200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nswrbistgdecoixqjqiyhtvtsknrkwly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925693.7770925-3255-99346057912438/AnsiballZ_file.py'
Dec 05 09:08:14 compute-1 sudo[155200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:14 compute-1 python3.9[155202]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:14 compute-1 sudo[155200]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:14 compute-1 sudo[155352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlpjbyolfqpqsdhctxqpkxfedikcbzcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925694.5057354-3279-43273315437879/AnsiballZ_find.py'
Dec 05 09:08:14 compute-1 sudo[155352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:15 compute-1 python3.9[155354]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 09:08:15 compute-1 sudo[155352]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:16 compute-1 sudo[155504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etwlipcnrqehpbmpmzcowwvcyfcfyezx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925695.786716-3321-117211199637655/AnsiballZ_stat.py'
Dec 05 09:08:16 compute-1 sudo[155504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:16 compute-1 python3.9[155506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:16 compute-1 sudo[155504]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:16 compute-1 sudo[155627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvdhuhzwygtwokrtmyphohotrjoiupyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925695.786716-3321-117211199637655/AnsiballZ_copy.py'
Dec 05 09:08:16 compute-1 sudo[155627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:16 compute-1 python3.9[155629]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925695.786716-3321-117211199637655/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:16 compute-1 sudo[155627]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:17 compute-1 sudo[155781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sungmznymsphibotwtbtlbsroaygfyfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925697.5740442-3369-188458813926846/AnsiballZ_file.py'
Dec 05 09:08:17 compute-1 sudo[155781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:18 compute-1 python3.9[155783]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:18 compute-1 sudo[155781]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:18 compute-1 sudo[155933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvwzghftypfiqblxqdmbnmlqimnpusal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925698.377593-3393-144694605123227/AnsiballZ_stat.py'
Dec 05 09:08:18 compute-1 sudo[155933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:18 compute-1 python3.9[155935]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:18 compute-1 sudo[155933]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:19 compute-1 sshd-session[155654]: Received disconnect from 122.168.194.41 port 60124:11: Bye Bye [preauth]
Dec 05 09:08:19 compute-1 sshd-session[155654]: Disconnected from authenticating user root 122.168.194.41 port 60124 [preauth]
Dec 05 09:08:19 compute-1 sudo[156011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioegoubqyojmiccqzudzrwrvfzefcjws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925698.377593-3393-144694605123227/AnsiballZ_file.py'
Dec 05 09:08:19 compute-1 sudo[156011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:19 compute-1 python3.9[156013]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:19 compute-1 sudo[156011]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:19 compute-1 sudo[156163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmpwzjihzzfubibgsskeveqccjwratyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925699.626552-3429-255674992033454/AnsiballZ_stat.py'
Dec 05 09:08:19 compute-1 sudo[156163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:20 compute-1 python3.9[156165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:20 compute-1 sudo[156163]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:20 compute-1 sudo[156241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsuipvbgqsnpolbgitwizwblzmgrchia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925699.626552-3429-255674992033454/AnsiballZ_file.py'
Dec 05 09:08:20 compute-1 sudo[156241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:20 compute-1 python3.9[156243]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zmzwn3yb recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:20 compute-1 sudo[156241]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:21 compute-1 sudo[156393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltublunkwdqhqubrzvvtuzkgikhbriiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925700.8268256-3465-163965746570018/AnsiballZ_stat.py'
Dec 05 09:08:21 compute-1 sudo[156393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:21 compute-1 python3.9[156395]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:21 compute-1 sudo[156393]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:21 compute-1 sudo[156471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdogrzsxvynsfejrezujtjupdyvbtbeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925700.8268256-3465-163965746570018/AnsiballZ_file.py'
Dec 05 09:08:21 compute-1 sudo[156471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:21 compute-1 python3.9[156473]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:21 compute-1 sudo[156471]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:21 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 05 09:08:21 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.158s CPU time.
Dec 05 09:08:22 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 05 09:08:22 compute-1 sudo[156623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mawvcgzpiqseftmbfpdweaysslyybfec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925702.3625216-3504-245462245250828/AnsiballZ_command.py'
Dec 05 09:08:22 compute-1 sudo[156623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:22 compute-1 python3.9[156625]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:08:22 compute-1 sudo[156623]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:23 compute-1 sudo[156776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rispacdqkkupkwyuaugwmbgstpdulfhk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925703.1285625-3528-134706920755809/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 09:08:23 compute-1 sudo[156776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:23 compute-1 python3[156778]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 09:08:23 compute-1 sudo[156776]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:24 compute-1 sudo[156928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztytkgjwjmbejwxqkpnesicmemhzgrrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925704.133399-3552-257357501699778/AnsiballZ_stat.py'
Dec 05 09:08:24 compute-1 sudo[156928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:24 compute-1 python3.9[156930]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:24 compute-1 sudo[156928]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:24 compute-1 sudo[157006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtufhehjkmlfdznwgmvlvrnreesdkcty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925704.133399-3552-257357501699778/AnsiballZ_file.py'
Dec 05 09:08:24 compute-1 sudo[157006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:25 compute-1 python3.9[157008]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:25 compute-1 sudo[157006]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:25 compute-1 sudo[157158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddojaqbglagrybhpuffrsuqxysdzrenv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925705.4166677-3588-178301513728506/AnsiballZ_stat.py'
Dec 05 09:08:25 compute-1 sudo[157158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:25 compute-1 python3.9[157160]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:26 compute-1 sudo[157158]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:26 compute-1 sudo[157236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrgoakupnrwpbfktluhrawosxjvflxll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925705.4166677-3588-178301513728506/AnsiballZ_file.py'
Dec 05 09:08:26 compute-1 sudo[157236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:26 compute-1 python3.9[157238]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:26 compute-1 sudo[157236]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:27 compute-1 sudo[157388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlhivhncrkjszumwijedflkuhakobgkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925706.9685833-3624-221236177099533/AnsiballZ_stat.py'
Dec 05 09:08:27 compute-1 sudo[157388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:27 compute-1 python3.9[157390]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:27 compute-1 sudo[157388]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:27 compute-1 sudo[157466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aadogiaalahnfrahuxapqmzuitvcuidj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925706.9685833-3624-221236177099533/AnsiballZ_file.py'
Dec 05 09:08:27 compute-1 sudo[157466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:27 compute-1 python3.9[157468]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:27 compute-1 sudo[157466]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:28 compute-1 sudo[157618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvswbsedjlyfhvioadpchedlsvkjyvrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925708.1907556-3660-213872509350080/AnsiballZ_stat.py'
Dec 05 09:08:28 compute-1 sudo[157618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:28 compute-1 python3.9[157620]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:28 compute-1 sudo[157618]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:28 compute-1 sudo[157696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxlyfmryqfqocqvswhjoqpxgfxmforis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925708.1907556-3660-213872509350080/AnsiballZ_file.py'
Dec 05 09:08:28 compute-1 sudo[157696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:29 compute-1 python3.9[157698]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:29 compute-1 sudo[157696]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:29 compute-1 sudo[157848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abrwdfbcokpsdlujyivtblsnglbozcuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925709.4275026-3696-56629413145444/AnsiballZ_stat.py'
Dec 05 09:08:29 compute-1 sudo[157848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:29 compute-1 python3.9[157850]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:30 compute-1 sudo[157848]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:30 compute-1 sudo[157973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieoejfbkzriadneredydohzewnoufpkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925709.4275026-3696-56629413145444/AnsiballZ_copy.py'
Dec 05 09:08:30 compute-1 sudo[157973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:30 compute-1 python3.9[157975]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764925709.4275026-3696-56629413145444/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:30 compute-1 sudo[157973]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:31 compute-1 sudo[158125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfnsxabvlrnfyffncneynkjvzoxoyjcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925710.811143-3741-266159245599302/AnsiballZ_file.py'
Dec 05 09:08:31 compute-1 sudo[158125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:31 compute-1 python3.9[158127]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:31 compute-1 sudo[158125]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:32 compute-1 sudo[158277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jerxhvfcjcqecuvwjtmujvdqdachilna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925711.8652608-3765-193006770149461/AnsiballZ_command.py'
Dec 05 09:08:32 compute-1 sudo[158277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:32 compute-1 python3.9[158279]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:08:32 compute-1 sudo[158277]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:33 compute-1 sudo[158432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhpszdqaeqtfbepqpipjdgdvkdtxvmac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925712.6224887-3789-31756730090723/AnsiballZ_blockinfile.py'
Dec 05 09:08:33 compute-1 sudo[158432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:33 compute-1 python3.9[158434]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:33 compute-1 sudo[158432]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:33 compute-1 sudo[158584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mexpzbtuxbllhirhbehodsafxoexmssa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925713.6108503-3816-111275274018205/AnsiballZ_command.py'
Dec 05 09:08:33 compute-1 sudo[158584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:34 compute-1 python3.9[158586]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:08:34 compute-1 sudo[158584]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:34 compute-1 sudo[158737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyprrfdbpurgcqoouvxlrlruukifadly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925714.3621695-3840-184823649150465/AnsiballZ_stat.py'
Dec 05 09:08:34 compute-1 sudo[158737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:34 compute-1 python3.9[158739]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:08:34 compute-1 sudo[158737]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:35 compute-1 sudo[158891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccketnbnmpmrzbqhplkmjpdwnrkzkmjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925715.111273-3864-208070907699426/AnsiballZ_command.py'
Dec 05 09:08:35 compute-1 sudo[158891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:35 compute-1 python3.9[158893]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:08:35 compute-1 sudo[158891]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:36 compute-1 sudo[159046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uccjrtllalyetbvsptddilkcaldtszaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925715.8395507-3888-158088340999465/AnsiballZ_file.py'
Dec 05 09:08:36 compute-1 sudo[159046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:36 compute-1 python3.9[159048]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:36 compute-1 sudo[159046]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:36 compute-1 sudo[159198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqmpffjoreqrharexgichddkiscddvkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925716.5215836-3912-265999325311560/AnsiballZ_stat.py'
Dec 05 09:08:36 compute-1 sudo[159198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:37 compute-1 python3.9[159200]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:37 compute-1 sudo[159198]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:37 compute-1 sudo[159321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gknfxjuocdtfisuvbwxhqefuoikeafqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925716.5215836-3912-265999325311560/AnsiballZ_copy.py'
Dec 05 09:08:37 compute-1 sudo[159321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:37 compute-1 python3.9[159323]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925716.5215836-3912-265999325311560/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:38 compute-1 sudo[159321]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:38 compute-1 sudo[159482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehikoeabiwaiymgsnycvjgdjexqaglva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925718.2239377-3957-5003336501210/AnsiballZ_stat.py'
Dec 05 09:08:38 compute-1 sudo[159482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:38 compute-1 podman[159447]: 2025-12-05 09:08:38.611477594 +0000 UTC m=+0.120575337 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 09:08:38 compute-1 python3.9[159492]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:38 compute-1 sudo[159482]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:39 compute-1 sudo[159620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agoafenfktkzamnfoyajsbhurozpgoga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925718.2239377-3957-5003336501210/AnsiballZ_copy.py'
Dec 05 09:08:39 compute-1 sudo[159620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:39 compute-1 python3.9[159622]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925718.2239377-3957-5003336501210/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:39 compute-1 sudo[159620]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:39 compute-1 sudo[159772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kntrgycavhoohtdwpvostmquelkhafqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925719.5941663-4003-50395057128384/AnsiballZ_stat.py'
Dec 05 09:08:39 compute-1 sudo[159772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:40 compute-1 python3.9[159774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:08:40 compute-1 sudo[159772]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:40 compute-1 sudo[159895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjuunvmldacwcuyuemsduzorglkasuqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925719.5941663-4003-50395057128384/AnsiballZ_copy.py'
Dec 05 09:08:40 compute-1 sudo[159895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:40 compute-1 python3.9[159897]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925719.5941663-4003-50395057128384/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:08:40 compute-1 sudo[159895]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:41 compute-1 sudo[160058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drvigvtpeowsycgvzkrgwrzlfzlxnpwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925721.1996732-4047-169096431649269/AnsiballZ_systemd.py'
Dec 05 09:08:41 compute-1 sudo[160058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:41 compute-1 podman[160021]: 2025-12-05 09:08:41.545564416 +0000 UTC m=+0.085387502 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:08:41 compute-1 python3.9[160062]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:08:41 compute-1 systemd[1]: Reloading.
Dec 05 09:08:41 compute-1 systemd-sysv-generator[160098]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:08:41 compute-1 systemd-rc-local-generator[160093]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:08:42 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Dec 05 09:08:42 compute-1 sudo[160058]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:42 compute-1 sudo[160257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bllcrnyrrompnbilffvplsrcyqvojaky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925722.413447-4071-60868897711786/AnsiballZ_systemd.py'
Dec 05 09:08:42 compute-1 sudo[160257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:43 compute-1 python3.9[160259]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 05 09:08:43 compute-1 systemd[1]: Reloading.
Dec 05 09:08:43 compute-1 systemd-rc-local-generator[160288]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:08:43 compute-1 systemd-sysv-generator[160291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:08:43 compute-1 systemd[1]: Reloading.
Dec 05 09:08:43 compute-1 systemd-rc-local-generator[160324]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:08:43 compute-1 systemd-sysv-generator[160328]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:08:43 compute-1 sudo[160257]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:44 compute-1 sshd-session[105818]: Connection closed by 192.168.122.30 port 59102
Dec 05 09:08:44 compute-1 sshd-session[105815]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:08:44 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Dec 05 09:08:44 compute-1 systemd[1]: session-24.scope: Consumed 3min 33.315s CPU time.
Dec 05 09:08:44 compute-1 systemd-logind[807]: Session 24 logged out. Waiting for processes to exit.
Dec 05 09:08:44 compute-1 systemd-logind[807]: Removed session 24.
Dec 05 09:08:49 compute-1 sshd-session[160357]: Accepted publickey for zuul from 192.168.122.30 port 56692 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:08:49 compute-1 systemd-logind[807]: New session 25 of user zuul.
Dec 05 09:08:49 compute-1 systemd[1]: Started Session 25 of User zuul.
Dec 05 09:08:49 compute-1 sshd-session[160357]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:08:50 compute-1 python3.9[160511]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:08:51 compute-1 sshd-session[160356]: Received disconnect from 101.47.162.91 port 53898:11: Bye Bye [preauth]
Dec 05 09:08:51 compute-1 sshd-session[160356]: Disconnected from authenticating user root 101.47.162.91 port 53898 [preauth]
Dec 05 09:08:52 compute-1 python3.9[160665]: ansible-ansible.builtin.service_facts Invoked
Dec 05 09:08:52 compute-1 network[160682]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 09:08:52 compute-1 network[160683]: 'network-scripts' will be removed from distribution in near future.
Dec 05 09:08:52 compute-1 network[160684]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 09:08:57 compute-1 sudo[160953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrqdukkgkjpvkadqcssjvqxdtzvjzqmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925736.8495667-102-60613196062767/AnsiballZ_setup.py'
Dec 05 09:08:57 compute-1 sudo[160953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:57 compute-1 python3.9[160955]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 09:08:57 compute-1 sudo[160953]: pam_unix(sudo:session): session closed for user root
Dec 05 09:08:58 compute-1 sudo[161037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmqaxqdysspcvpdlgoszoehbxowkatyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925736.8495667-102-60613196062767/AnsiballZ_dnf.py'
Dec 05 09:08:58 compute-1 sudo[161037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:08:58 compute-1 python3.9[161039]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 09:09:04 compute-1 sudo[161037]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:04 compute-1 sudo[161190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrfhccozlmaroidjimsfdibpjwksomac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925744.5352254-138-69252522201935/AnsiballZ_stat.py'
Dec 05 09:09:04 compute-1 sudo[161190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:05 compute-1 python3.9[161192]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:09:05 compute-1 sudo[161190]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:05 compute-1 sudo[161342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzspimoeduijzldrpdcslvaoosyxwfzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925745.53766-168-162596296840021/AnsiballZ_command.py'
Dec 05 09:09:05 compute-1 sudo[161342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:06 compute-1 python3.9[161344]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:09:06 compute-1 sudo[161342]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:07 compute-1 sudo[161495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prslsmdhntkukkylqruinvinuralvmpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925746.8362432-198-25031612993184/AnsiballZ_stat.py'
Dec 05 09:09:07 compute-1 sudo[161495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:07 compute-1 python3.9[161497]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:09:07 compute-1 sudo[161495]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:07 compute-1 sudo[161647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcvpeprgfvoiyyxoajuxbapbxoatmvxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925747.4977353-222-177341322364789/AnsiballZ_command.py'
Dec 05 09:09:07 compute-1 sudo[161647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:07 compute-1 python3.9[161649]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:09:08 compute-1 sudo[161647]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:08 compute-1 sudo[161800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spshimpheooxxngycefljfckkdqzqtbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925748.1893704-246-7025140634034/AnsiballZ_stat.py'
Dec 05 09:09:08 compute-1 sudo[161800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:08 compute-1 python3.9[161802]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:08 compute-1 sudo[161800]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:09:08.851 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:09:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:09:08.853 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:09:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:09:08.854 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:09:09 compute-1 sudo[161933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnfuzyohwjyjqranstwrtqvcpsepxebn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925748.1893704-246-7025140634034/AnsiballZ_copy.py'
Dec 05 09:09:09 compute-1 sudo[161933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:09 compute-1 podman[161897]: 2025-12-05 09:09:09.329551232 +0000 UTC m=+0.109734824 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 05 09:09:09 compute-1 python3.9[161940]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925748.1893704-246-7025140634034/.source.iscsi _original_basename=.dvzvysx8 follow=False checksum=7ca62bb492ebfd7654fff998f2e075671fe5fff0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:09 compute-1 sudo[161933]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:10 compute-1 sudo[162101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktxqlbaxlkufrnazayvycevbijugoyzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925749.6955042-291-251195731499752/AnsiballZ_file.py'
Dec 05 09:09:10 compute-1 sudo[162101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:10 compute-1 python3.9[162103]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:10 compute-1 sudo[162101]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:11 compute-1 sudo[162253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stxbogbkxusgvgfnuatbjkxcdmytzzsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925750.5809872-315-64999834982573/AnsiballZ_lineinfile.py'
Dec 05 09:09:11 compute-1 sudo[162253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:11 compute-1 python3.9[162255]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:11 compute-1 sudo[162253]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:12 compute-1 sudo[162421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqghwpjhxcslavzeytotrggvucmixkcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925751.5349193-342-38175821233157/AnsiballZ_systemd_service.py'
Dec 05 09:09:12 compute-1 sudo[162421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:12 compute-1 podman[162379]: 2025-12-05 09:09:12.286427215 +0000 UTC m=+0.056939768 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:09:12 compute-1 python3.9[162427]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:09:12 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 05 09:09:12 compute-1 sudo[162421]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:13 compute-1 sudo[162581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuamnisqfnqfxlhfbbmubjweqmwfvviv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925752.9191442-366-60897605440927/AnsiballZ_systemd_service.py'
Dec 05 09:09:13 compute-1 sudo[162581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:13 compute-1 python3.9[162583]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:09:13 compute-1 systemd[1]: Reloading.
Dec 05 09:09:13 compute-1 systemd-rc-local-generator[162612]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:09:13 compute-1 systemd-sysv-generator[162615]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:09:13 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 05 09:09:13 compute-1 systemd[1]: Starting Open-iSCSI...
Dec 05 09:09:13 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Dec 05 09:09:13 compute-1 systemd[1]: Started Open-iSCSI.
Dec 05 09:09:13 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 05 09:09:13 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 05 09:09:13 compute-1 sudo[162581]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:15 compute-1 sudo[162784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceifzztllzzbbttnahutnnjmvmvwuluz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925754.8984938-399-188733724539975/AnsiballZ_service_facts.py'
Dec 05 09:09:15 compute-1 sudo[162784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:15 compute-1 python3.9[162786]: ansible-ansible.builtin.service_facts Invoked
Dec 05 09:09:15 compute-1 network[162803]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 09:09:15 compute-1 network[162804]: 'network-scripts' will be removed from distribution in near future.
Dec 05 09:09:15 compute-1 network[162805]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 09:09:18 compute-1 sudo[162784]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:20 compute-1 sshd-session[162925]: Received disconnect from 43.225.158.169 port 58625:11: Bye Bye [preauth]
Dec 05 09:09:20 compute-1 sshd-session[162925]: Disconnected from authenticating user root 43.225.158.169 port 58625 [preauth]
Dec 05 09:09:21 compute-1 sudo[163076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frnwhvcoyzolgatfiwnfrgrasrxfeypr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925760.7661514-429-120847352166830/AnsiballZ_file.py'
Dec 05 09:09:21 compute-1 sudo[163076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:21 compute-1 python3.9[163078]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 05 09:09:21 compute-1 sudo[163076]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:22 compute-1 sudo[163228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syyzxxrrdrwyqdaswrfxdqqgeigoayul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925761.489701-453-263789884575904/AnsiballZ_modprobe.py'
Dec 05 09:09:22 compute-1 sudo[163228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:22 compute-1 python3.9[163230]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 05 09:09:22 compute-1 sudo[163228]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:23 compute-1 sudo[163386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iisktahxzafroomkwktlhtelmmglxckr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925762.5634577-477-157768569191423/AnsiballZ_stat.py'
Dec 05 09:09:23 compute-1 sudo[163386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:23 compute-1 python3.9[163388]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:23 compute-1 sudo[163386]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:23 compute-1 sshd-session[163234]: Received disconnect from 185.118.15.236 port 34680:11: Bye Bye [preauth]
Dec 05 09:09:23 compute-1 sshd-session[163234]: Disconnected from authenticating user root 185.118.15.236 port 34680 [preauth]
Dec 05 09:09:23 compute-1 sudo[163509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfmzqcgdtilskzcrmpmkkrutmrevalkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925762.5634577-477-157768569191423/AnsiballZ_copy.py'
Dec 05 09:09:23 compute-1 sudo[163509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:23 compute-1 python3.9[163511]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925762.5634577-477-157768569191423/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:23 compute-1 sudo[163509]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:24 compute-1 sudo[163661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjclofauyxwjkoktrbhigfvsaptsqyys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925764.2735608-525-274974768265047/AnsiballZ_lineinfile.py'
Dec 05 09:09:24 compute-1 sudo[163661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:24 compute-1 python3.9[163663]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:24 compute-1 sudo[163661]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:25 compute-1 sudo[163813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpprfkogehylujhgurvrboosgrvqswtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925765.041941-549-46093085327992/AnsiballZ_systemd.py'
Dec 05 09:09:25 compute-1 sudo[163813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:26 compute-1 python3.9[163815]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:09:26 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 05 09:09:26 compute-1 systemd[1]: Stopped Load Kernel Modules.
Dec 05 09:09:26 compute-1 systemd[1]: Stopping Load Kernel Modules...
Dec 05 09:09:26 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 05 09:09:26 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 05 09:09:26 compute-1 sudo[163813]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:26 compute-1 sudo[163969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbpkuvtkvgtjkvnnnylendsbtakfkgea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925766.3743665-573-186739398069620/AnsiballZ_file.py'
Dec 05 09:09:26 compute-1 sudo[163969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:26 compute-1 python3.9[163971]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:09:26 compute-1 sudo[163969]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:27 compute-1 sudo[164121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncqhklgqeciziujzgzmkzcfjglzipjqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925767.2013361-600-265911590601585/AnsiballZ_stat.py'
Dec 05 09:09:27 compute-1 sudo[164121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:27 compute-1 python3.9[164123]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:09:27 compute-1 sudo[164121]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:28 compute-1 sudo[164273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjwsdypgmwanhzzpmrxfphrnsuxjnoux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925767.9911022-627-262570748050238/AnsiballZ_stat.py'
Dec 05 09:09:28 compute-1 sudo[164273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:28 compute-1 python3.9[164275]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:09:28 compute-1 sudo[164273]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:29 compute-1 sudo[164425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foqpklxwnsebpqirrcahztecskkunnvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925768.7109709-651-124345176975924/AnsiballZ_stat.py'
Dec 05 09:09:29 compute-1 sudo[164425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:29 compute-1 python3.9[164427]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:29 compute-1 sudo[164425]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:29 compute-1 sudo[164548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utnvatpkgzyerdtldgubccdmnyaudoho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925768.7109709-651-124345176975924/AnsiballZ_copy.py'
Dec 05 09:09:29 compute-1 sudo[164548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:29 compute-1 python3.9[164550]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925768.7109709-651-124345176975924/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:29 compute-1 sudo[164548]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:30 compute-1 sudo[164700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxwszdrwzvkkaxmhghaydrtetniiupi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925770.0412712-696-181431317108565/AnsiballZ_command.py'
Dec 05 09:09:30 compute-1 sudo[164700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:30 compute-1 python3.9[164702]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:09:30 compute-1 sudo[164700]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:31 compute-1 sudo[164853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpphkyrzywvanmbuobfgidzunerqdokd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925770.8449283-720-256159519770004/AnsiballZ_lineinfile.py'
Dec 05 09:09:31 compute-1 sudo[164853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:31 compute-1 python3.9[164855]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:31 compute-1 sudo[164853]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:31 compute-1 sudo[165005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llfncjcgazcxtejthjmfmeqpfzlhyrtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925771.569071-744-161928170146782/AnsiballZ_replace.py'
Dec 05 09:09:32 compute-1 sudo[165005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:32 compute-1 python3.9[165007]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:32 compute-1 sudo[165005]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:32 compute-1 sudo[165157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nztemkuikcnpphqvgsufiyuxanndhmpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925772.4000113-768-14837506613559/AnsiballZ_replace.py'
Dec 05 09:09:32 compute-1 sudo[165157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:32 compute-1 python3.9[165159]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:32 compute-1 sudo[165157]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:33 compute-1 sudo[165309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwepcdigsbspdupejpzxunvfegsulevx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925773.288337-795-182678115623035/AnsiballZ_lineinfile.py'
Dec 05 09:09:33 compute-1 sudo[165309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:33 compute-1 python3.9[165311]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:33 compute-1 sudo[165309]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:34 compute-1 sudo[165461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovjqgjqlewtplauiihzcqgrahbszrfus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925773.9986293-795-227865782983227/AnsiballZ_lineinfile.py'
Dec 05 09:09:34 compute-1 sudo[165461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:34 compute-1 python3.9[165463]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:34 compute-1 sudo[165461]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:35 compute-1 sudo[165613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdaifgfgfwweecqrervvobhpimctmxfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925775.058046-795-158177905427086/AnsiballZ_lineinfile.py'
Dec 05 09:09:35 compute-1 sudo[165613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:35 compute-1 python3.9[165615]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:35 compute-1 sudo[165613]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:36 compute-1 sudo[165765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pggtovuiqhohupjtjokxsmmircooqthw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925775.918339-795-271669981340884/AnsiballZ_lineinfile.py'
Dec 05 09:09:36 compute-1 sudo[165765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:36 compute-1 python3.9[165767]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:36 compute-1 sudo[165765]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:36 compute-1 sudo[165917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfgrqaxzmstqhtaulfindxvpaxcngvqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925776.6470768-882-123719021533145/AnsiballZ_stat.py'
Dec 05 09:09:36 compute-1 sudo[165917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:37 compute-1 python3.9[165919]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:09:37 compute-1 sudo[165917]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:37 compute-1 sudo[166071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqvffqcvdlxmpyjwactlwfwlzmvlzzrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925777.3904138-906-270643017910980/AnsiballZ_file.py'
Dec 05 09:09:37 compute-1 sudo[166071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:37 compute-1 python3.9[166073]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:37 compute-1 sudo[166071]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:38 compute-1 sudo[166225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hywftfypmdmltoewfhvvyghvzufvcyqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925778.2544758-933-186546874592219/AnsiballZ_file.py'
Dec 05 09:09:38 compute-1 sudo[166225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:38 compute-1 python3.9[166227]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:09:38 compute-1 sudo[166225]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:39 compute-1 sshd-session[166074]: Received disconnect from 122.168.194.41 port 48158:11: Bye Bye [preauth]
Dec 05 09:09:39 compute-1 sshd-session[166074]: Disconnected from authenticating user root 122.168.194.41 port 48158 [preauth]
Dec 05 09:09:39 compute-1 sudo[166387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olpdjusvexynvxlwopfefgemmrypidow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925779.129306-958-267809639647861/AnsiballZ_stat.py'
Dec 05 09:09:39 compute-1 sudo[166387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:39 compute-1 podman[166351]: 2025-12-05 09:09:39.633068002 +0000 UTC m=+0.155575412 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:09:39 compute-1 python3.9[166394]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:39 compute-1 sudo[166387]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:39 compute-1 sudo[166482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utjurhuhyrqdxpjhtcgvcuonhsbspujb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925779.129306-958-267809639647861/AnsiballZ_file.py'
Dec 05 09:09:39 compute-1 sudo[166482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:40 compute-1 python3.9[166484]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:09:40 compute-1 sudo[166482]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:40 compute-1 sudo[166634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yneobhhdwgkfagctrbgswonuxvdrtcij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925780.3479867-958-184849895963875/AnsiballZ_stat.py'
Dec 05 09:09:40 compute-1 sudo[166634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:40 compute-1 python3.9[166636]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:40 compute-1 sudo[166634]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:41 compute-1 sudo[166712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxxvczpkhctyxflswmgfdubeodnckncj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925780.3479867-958-184849895963875/AnsiballZ_file.py'
Dec 05 09:09:41 compute-1 sudo[166712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:41 compute-1 python3.9[166714]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:09:41 compute-1 sudo[166712]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:41 compute-1 sudo[166864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiyjjgadqqgkwhuzykohjekbqglxwhro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925781.549546-1026-277101908185253/AnsiballZ_file.py'
Dec 05 09:09:41 compute-1 sudo[166864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:42 compute-1 python3.9[166866]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:42 compute-1 sudo[166864]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:42 compute-1 sudo[167027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuvnprxeclbbzhbmpdwsdycmsrzlhyia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925782.2364523-1050-172662247654033/AnsiballZ_stat.py'
Dec 05 09:09:42 compute-1 sudo[167027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:42 compute-1 podman[166990]: 2025-12-05 09:09:42.567233917 +0000 UTC m=+0.062986171 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:09:42 compute-1 python3.9[167035]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:42 compute-1 sudo[167027]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:43 compute-1 sudo[167113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhagrlpfcgspsacbhsznkneoomhmjqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925782.2364523-1050-172662247654033/AnsiballZ_file.py'
Dec 05 09:09:43 compute-1 sudo[167113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:43 compute-1 python3.9[167115]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:43 compute-1 sudo[167113]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:43 compute-1 sudo[167265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqyrgiszbtbcbklzzozwjxqomzvooifp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925783.6017954-1086-132860356822434/AnsiballZ_stat.py'
Dec 05 09:09:43 compute-1 sudo[167265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:44 compute-1 python3.9[167267]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:44 compute-1 sudo[167265]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:44 compute-1 sudo[167343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tekefswfefvcjfyvxqazezrvxelmtomm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925783.6017954-1086-132860356822434/AnsiballZ_file.py'
Dec 05 09:09:44 compute-1 sudo[167343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:44 compute-1 python3.9[167345]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:44 compute-1 sudo[167343]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:45 compute-1 sudo[167495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqequbpbwmrvtkiidgevtictwznalsmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925784.7932577-1122-10585776949903/AnsiballZ_systemd.py'
Dec 05 09:09:45 compute-1 sudo[167495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:45 compute-1 python3.9[167497]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:09:45 compute-1 systemd[1]: Reloading.
Dec 05 09:09:45 compute-1 systemd-rc-local-generator[167520]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:09:45 compute-1 systemd-sysv-generator[167527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:09:45 compute-1 sudo[167495]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:46 compute-1 sudo[167683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jujrowhedqdtetddljbilvloaqyjbbtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925785.9745667-1146-102980376063330/AnsiballZ_stat.py'
Dec 05 09:09:46 compute-1 sudo[167683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:46 compute-1 python3.9[167685]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:46 compute-1 sudo[167683]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:46 compute-1 sudo[167761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvddkinliltsrtlvzdoxoaghzejmxvps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925785.9745667-1146-102980376063330/AnsiballZ_file.py'
Dec 05 09:09:46 compute-1 sudo[167761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:46 compute-1 python3.9[167763]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:47 compute-1 sudo[167761]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:47 compute-1 sudo[167913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wozauuapwgcugxgstbgemqcuvwoidyrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925787.1939476-1182-180624163662434/AnsiballZ_stat.py'
Dec 05 09:09:47 compute-1 sudo[167913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:47 compute-1 python3.9[167915]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:47 compute-1 sudo[167913]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:47 compute-1 sudo[167991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmzztanibxkczegzkcilqcffhgqautvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925787.1939476-1182-180624163662434/AnsiballZ_file.py'
Dec 05 09:09:47 compute-1 sudo[167991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:48 compute-1 python3.9[167993]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:48 compute-1 sudo[167991]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:48 compute-1 sudo[168143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcjnnfbancdauvebuixsqxwwvkvtmqxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925788.3706863-1218-80434799197017/AnsiballZ_systemd.py'
Dec 05 09:09:48 compute-1 sudo[168143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:48 compute-1 python3.9[168145]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:09:49 compute-1 systemd[1]: Reloading.
Dec 05 09:09:49 compute-1 systemd-sysv-generator[168174]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:09:49 compute-1 systemd-rc-local-generator[168168]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:09:49 compute-1 systemd[1]: Starting Create netns directory...
Dec 05 09:09:49 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 09:09:49 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 09:09:49 compute-1 systemd[1]: Finished Create netns directory.
Dec 05 09:09:49 compute-1 sudo[168143]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:50 compute-1 sudo[168336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zanvprqygjlyniedteocliihfdwuimaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925789.8403428-1248-137294767781419/AnsiballZ_file.py'
Dec 05 09:09:50 compute-1 sudo[168336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:50 compute-1 python3.9[168338]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:09:50 compute-1 sudo[168336]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:50 compute-1 sudo[168488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyfmshatprrjpytzatizpudypcsgvclx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925790.5847263-1272-44532036365309/AnsiballZ_stat.py'
Dec 05 09:09:50 compute-1 sudo[168488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:51 compute-1 python3.9[168490]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:51 compute-1 sudo[168488]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:51 compute-1 sudo[168611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgxnsuzzwunweshmtqdaukipntkcjcmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925790.5847263-1272-44532036365309/AnsiballZ_copy.py'
Dec 05 09:09:51 compute-1 sudo[168611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:51 compute-1 python3.9[168613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925790.5847263-1272-44532036365309/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:09:51 compute-1 sudo[168611]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:52 compute-1 sudo[168763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glaaycxdujjaiaysemshgqqdaqoetzqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925792.343161-1323-148330136778114/AnsiballZ_file.py'
Dec 05 09:09:52 compute-1 sudo[168763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:52 compute-1 python3.9[168765]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:52 compute-1 sudo[168763]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:53 compute-1 sudo[168915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odetzuronlfbvdxfneigvxstkrepcodj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925793.0797358-1347-138136669720337/AnsiballZ_file.py'
Dec 05 09:09:53 compute-1 sudo[168915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:53 compute-1 python3.9[168917]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:09:53 compute-1 sudo[168915]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:54 compute-1 sudo[169067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbpriqawlzqhnqvcfvviltzkibhwokms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925794.113802-1371-81571036885888/AnsiballZ_stat.py'
Dec 05 09:09:54 compute-1 sudo[169067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:54 compute-1 python3.9[169069]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:09:54 compute-1 sudo[169067]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:55 compute-1 sudo[169190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyjqydkgnuxtocrklmmxbkouzpuekqbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925794.113802-1371-81571036885888/AnsiballZ_copy.py'
Dec 05 09:09:55 compute-1 sudo[169190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:55 compute-1 python3.9[169192]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925794.113802-1371-81571036885888/.source.json _original_basename=.hv2f89iy follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:55 compute-1 sudo[169190]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:55 compute-1 python3.9[169342]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:09:58 compute-1 sudo[169763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlsiflvzxkqvhhyjpxqzzllpldpachsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925798.1475277-1491-122161645957464/AnsiballZ_container_config_data.py'
Dec 05 09:09:58 compute-1 sudo[169763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:09:58 compute-1 python3.9[169765]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 05 09:09:58 compute-1 sudo[169763]: pam_unix(sudo:session): session closed for user root
Dec 05 09:09:59 compute-1 sudo[169915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqyhocqxktgfqhvhzbfjdrkubxplcvsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925799.3397832-1524-25738925546112/AnsiballZ_container_config_hash.py'
Dec 05 09:09:59 compute-1 sudo[169915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:00 compute-1 python3.9[169917]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:10:00 compute-1 sudo[169915]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:01 compute-1 sudo[170067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkpagsooglsajlzseclcfhxjwlzumqjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925800.4374287-1551-122604170070694/AnsiballZ_podman_container_info.py'
Dec 05 09:10:01 compute-1 sudo[170067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:01 compute-1 python3.9[170069]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 09:10:01 compute-1 sudo[170067]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:03 compute-1 sudo[170246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxzbcufndeoghtxwyqduacqrjrscouij ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925802.4693727-1590-130300658871339/AnsiballZ_edpm_container_manage.py'
Dec 05 09:10:03 compute-1 sudo[170246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:03 compute-1 python3[170248]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json containers=['multipathd'] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:10:03 compute-1 podman[170282]: 2025-12-05 09:10:03.518833844 +0000 UTC m=+0.055125143 container create 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 09:10:03 compute-1 podman[170282]: 2025-12-05 09:10:03.486886354 +0000 UTC m=+0.023177673 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 05 09:10:03 compute-1 python3[170248]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 05 09:10:03 compute-1 sudo[170246]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:04 compute-1 sudo[170470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omsyfalyfmwixdoolbxoyefyympghppj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925803.8909743-1614-89202226521317/AnsiballZ_stat.py'
Dec 05 09:10:04 compute-1 sudo[170470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:04 compute-1 python3.9[170472]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:10:04 compute-1 sudo[170470]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:05 compute-1 sudo[170624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prqfktpnlyaerbnkvyamdwufssodmayj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925804.78576-1641-46664122135791/AnsiballZ_file.py'
Dec 05 09:10:05 compute-1 sudo[170624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:05 compute-1 python3.9[170626]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:05 compute-1 sudo[170624]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:05 compute-1 sudo[170700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpqzzksvrwxnbrpuifwnblgotvhxzbgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925804.78576-1641-46664122135791/AnsiballZ_stat.py'
Dec 05 09:10:05 compute-1 sudo[170700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:05 compute-1 python3.9[170702]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:10:05 compute-1 sudo[170700]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:06 compute-1 sudo[170851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsurqjpcztdmaxijwjjycfrvahttfopa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925805.9529278-1641-85535539596116/AnsiballZ_copy.py'
Dec 05 09:10:06 compute-1 sudo[170851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:06 compute-1 python3.9[170853]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764925805.9529278-1641-85535539596116/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:06 compute-1 sudo[170851]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:06 compute-1 sudo[170927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prsqthqatynyctfsmkxdwnfnyhkjwbgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925805.9529278-1641-85535539596116/AnsiballZ_systemd.py'
Dec 05 09:10:06 compute-1 sudo[170927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:07 compute-1 python3.9[170929]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:10:07 compute-1 systemd[1]: Reloading.
Dec 05 09:10:07 compute-1 systemd-rc-local-generator[170956]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:10:07 compute-1 systemd-sysv-generator[170960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:10:07 compute-1 sudo[170927]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:07 compute-1 sudo[171038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdknywymsdnmprngijidnuvkhyxftlgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925805.9529278-1641-85535539596116/AnsiballZ_systemd.py'
Dec 05 09:10:07 compute-1 sudo[171038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:08 compute-1 python3.9[171040]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:08 compute-1 systemd[1]: Reloading.
Dec 05 09:10:08 compute-1 systemd-rc-local-generator[171068]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:10:08 compute-1 systemd-sysv-generator[171072]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:10:08 compute-1 systemd[1]: Starting multipathd container...
Dec 05 09:10:08 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:10:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd9b9e36e8b1e1ed0a43c1624039408fde7b1f2e337322593fa33e1276ba70a9/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 09:10:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd9b9e36e8b1e1ed0a43c1624039408fde7b1f2e337322593fa33e1276ba70a9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 09:10:08 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176.
Dec 05 09:10:08 compute-1 podman[171080]: 2025-12-05 09:10:08.630017226 +0000 UTC m=+0.126077823 container init 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 05 09:10:08 compute-1 multipathd[171095]: + sudo -E kolla_set_configs
Dec 05 09:10:08 compute-1 podman[171080]: 2025-12-05 09:10:08.6541928 +0000 UTC m=+0.150253377 container start 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:10:08 compute-1 podman[171080]: multipathd
Dec 05 09:10:08 compute-1 systemd[1]: Started multipathd container.
Dec 05 09:10:08 compute-1 sudo[171102]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 09:10:08 compute-1 sudo[171102]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 09:10:08 compute-1 sudo[171102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 09:10:08 compute-1 sudo[171038]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:08 compute-1 multipathd[171095]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 09:10:08 compute-1 multipathd[171095]: INFO:__main__:Validating config file
Dec 05 09:10:08 compute-1 multipathd[171095]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 09:10:08 compute-1 multipathd[171095]: INFO:__main__:Writing out command to execute
Dec 05 09:10:08 compute-1 sudo[171102]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:08 compute-1 multipathd[171095]: ++ cat /run_command
Dec 05 09:10:08 compute-1 multipathd[171095]: + CMD='/usr/sbin/multipathd -d'
Dec 05 09:10:08 compute-1 multipathd[171095]: + ARGS=
Dec 05 09:10:08 compute-1 multipathd[171095]: + sudo kolla_copy_cacerts
Dec 05 09:10:08 compute-1 sudo[171125]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 09:10:08 compute-1 sudo[171125]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 09:10:08 compute-1 sudo[171125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 09:10:08 compute-1 sudo[171125]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:08 compute-1 multipathd[171095]: + [[ ! -n '' ]]
Dec 05 09:10:08 compute-1 multipathd[171095]: + . kolla_extend_start
Dec 05 09:10:08 compute-1 multipathd[171095]: Running command: '/usr/sbin/multipathd -d'
Dec 05 09:10:08 compute-1 multipathd[171095]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 05 09:10:08 compute-1 multipathd[171095]: + umask 0022
Dec 05 09:10:08 compute-1 multipathd[171095]: + exec /usr/sbin/multipathd -d
Dec 05 09:10:08 compute-1 podman[171101]: 2025-12-05 09:10:08.762262728 +0000 UTC m=+0.094827827 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:10:08 compute-1 systemd[1]: 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176-b63096dc7672e26.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 09:10:08 compute-1 systemd[1]: 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176-b63096dc7672e26.service: Failed with result 'exit-code'.
Dec 05 09:10:08 compute-1 multipathd[171095]: 3081.831390 | --------start up--------
Dec 05 09:10:08 compute-1 multipathd[171095]: 3081.831414 | read /etc/multipath.conf
Dec 05 09:10:08 compute-1 multipathd[171095]: 3081.837929 | path checkers start up
Dec 05 09:10:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:10:08.852 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:10:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:10:08.853 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:10:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:10:08.854 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:10:09 compute-1 python3.9[171284]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Dec 05 09:10:09 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 05 09:10:09 compute-1 podman[171286]: 2025-12-05 09:10:09.783642292 +0000 UTC m=+0.101479530 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:10:10 compute-1 sudo[171461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlnhqhyuphffidtnetnggpzygybiswva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925810.1835518-1764-62841849602656/AnsiballZ_stat.py'
Dec 05 09:10:10 compute-1 sudo[171461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:10 compute-1 python3.9[171463]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:10:10 compute-1 sudo[171461]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:10 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 05 09:10:11 compute-1 sudo[171587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diqnvmusjqsesgdmomplbktvrrnxlpyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925810.1835518-1764-62841849602656/AnsiballZ_copy.py'
Dec 05 09:10:11 compute-1 sudo[171587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:11 compute-1 python3.9[171589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925810.1835518-1764-62841849602656/.source.yaml _original_basename=.rf4co65t follow=False checksum=f4fccae3e6c61a8c9b2044ea78e4eba88bda5466 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:11 compute-1 sudo[171587]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:11 compute-1 python3.9[171739]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:10:12 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 05 09:10:12 compute-1 sudo[171892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kziobtctrysxevhyddbnugozeavavkva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925812.3021598-1833-93252098790111/AnsiballZ_command.py'
Dec 05 09:10:12 compute-1 sudo[171892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:12 compute-1 podman[171894]: 2025-12-05 09:10:12.673351003 +0000 UTC m=+0.050046126 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 05 09:10:12 compute-1 python3.9[171895]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:10:12 compute-1 sudo[171892]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:13 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 05 09:10:13 compute-1 sudo[172076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwhomswydmqqcajnlcwxvjjasrzscozo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925813.131435-1857-124112280101931/AnsiballZ_systemd.py'
Dec 05 09:10:13 compute-1 sudo[172076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:13 compute-1 python3.9[172078]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:10:13 compute-1 systemd[1]: Stopping multipathd container...
Dec 05 09:10:13 compute-1 multipathd[171095]: 3086.975618 | exit (signal)
Dec 05 09:10:13 compute-1 multipathd[171095]: 3086.975705 | --------shut down-------
Dec 05 09:10:13 compute-1 systemd[1]: libpod-3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176.scope: Deactivated successfully.
Dec 05 09:10:13 compute-1 podman[172082]: 2025-12-05 09:10:13.950461365 +0000 UTC m=+0.079622465 container died 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 09:10:13 compute-1 systemd[1]: 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176-b63096dc7672e26.timer: Deactivated successfully.
Dec 05 09:10:13 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176.
Dec 05 09:10:13 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176-userdata-shm.mount: Deactivated successfully.
Dec 05 09:10:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-fd9b9e36e8b1e1ed0a43c1624039408fde7b1f2e337322593fa33e1276ba70a9-merged.mount: Deactivated successfully.
Dec 05 09:10:14 compute-1 podman[172082]: 2025-12-05 09:10:14.01984222 +0000 UTC m=+0.149003320 container cleanup 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Dec 05 09:10:14 compute-1 podman[172082]: multipathd
Dec 05 09:10:14 compute-1 podman[172112]: multipathd
Dec 05 09:10:14 compute-1 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 05 09:10:14 compute-1 systemd[1]: Stopped multipathd container.
Dec 05 09:10:14 compute-1 systemd[1]: Starting multipathd container...
Dec 05 09:10:14 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:10:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd9b9e36e8b1e1ed0a43c1624039408fde7b1f2e337322593fa33e1276ba70a9/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 09:10:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd9b9e36e8b1e1ed0a43c1624039408fde7b1f2e337322593fa33e1276ba70a9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 09:10:14 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176.
Dec 05 09:10:14 compute-1 podman[172125]: 2025-12-05 09:10:14.221117681 +0000 UTC m=+0.118950981 container init 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 05 09:10:14 compute-1 multipathd[172141]: + sudo -E kolla_set_configs
Dec 05 09:10:14 compute-1 sudo[172147]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 09:10:14 compute-1 sudo[172147]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 09:10:14 compute-1 sudo[172147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 09:10:14 compute-1 podman[172125]: 2025-12-05 09:10:14.256156506 +0000 UTC m=+0.153989806 container start 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:10:14 compute-1 podman[172125]: multipathd
Dec 05 09:10:14 compute-1 systemd[1]: Started multipathd container.
Dec 05 09:10:14 compute-1 multipathd[172141]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 09:10:14 compute-1 multipathd[172141]: INFO:__main__:Validating config file
Dec 05 09:10:14 compute-1 multipathd[172141]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 09:10:14 compute-1 multipathd[172141]: INFO:__main__:Writing out command to execute
Dec 05 09:10:14 compute-1 sudo[172147]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:14 compute-1 multipathd[172141]: ++ cat /run_command
Dec 05 09:10:14 compute-1 sudo[172076]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:14 compute-1 multipathd[172141]: + CMD='/usr/sbin/multipathd -d'
Dec 05 09:10:14 compute-1 multipathd[172141]: + ARGS=
Dec 05 09:10:14 compute-1 multipathd[172141]: + sudo kolla_copy_cacerts
Dec 05 09:10:14 compute-1 sudo[172170]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 09:10:14 compute-1 sudo[172170]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 09:10:14 compute-1 sudo[172170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 09:10:14 compute-1 podman[172148]: 2025-12-05 09:10:14.336058785 +0000 UTC m=+0.066868033 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:10:14 compute-1 sudo[172170]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:14 compute-1 multipathd[172141]: + [[ ! -n '' ]]
Dec 05 09:10:14 compute-1 multipathd[172141]: + . kolla_extend_start
Dec 05 09:10:14 compute-1 multipathd[172141]: Running command: '/usr/sbin/multipathd -d'
Dec 05 09:10:14 compute-1 multipathd[172141]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 05 09:10:14 compute-1 multipathd[172141]: + umask 0022
Dec 05 09:10:14 compute-1 multipathd[172141]: + exec /usr/sbin/multipathd -d
Dec 05 09:10:14 compute-1 systemd[1]: 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176-16bbe0c5ad8512d5.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 09:10:14 compute-1 systemd[1]: 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176-16bbe0c5ad8512d5.service: Failed with result 'exit-code'.
Dec 05 09:10:14 compute-1 multipathd[172141]: 3087.416813 | --------start up--------
Dec 05 09:10:14 compute-1 multipathd[172141]: 3087.416832 | read /etc/multipath.conf
Dec 05 09:10:14 compute-1 multipathd[172141]: 3087.422972 | path checkers start up
Dec 05 09:10:14 compute-1 sudo[172330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saoobugtaucimjwljpbbcpbllkiywayw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925814.5760512-1881-203915141650935/AnsiballZ_file.py'
Dec 05 09:10:14 compute-1 sudo[172330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:15 compute-1 python3.9[172332]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:15 compute-1 sudo[172330]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:16 compute-1 sudo[172482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prqibgafqtvryjusjgviyqxqbleigwal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925815.7255137-1917-31227204640283/AnsiballZ_file.py'
Dec 05 09:10:16 compute-1 sudo[172482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:16 compute-1 python3.9[172484]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 05 09:10:16 compute-1 sudo[172482]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:16 compute-1 sudo[172634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxhtwifosamlibdedhvqjnbrihhdcdsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925816.486133-1941-4569435921060/AnsiballZ_modprobe.py'
Dec 05 09:10:16 compute-1 sudo[172634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:16 compute-1 python3.9[172636]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 05 09:10:17 compute-1 kernel: Key type psk registered
Dec 05 09:10:17 compute-1 sudo[172634]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:17 compute-1 sudo[172797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijthubitcviomohibxtrkqxwqenfimgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925817.2936718-1965-148623399046476/AnsiballZ_stat.py'
Dec 05 09:10:17 compute-1 sudo[172797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:17 compute-1 python3.9[172799]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:10:17 compute-1 sudo[172797]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:18 compute-1 sudo[172920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aetdhjgdbybibbilqihcvfqzrrqwkzyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925817.2936718-1965-148623399046476/AnsiballZ_copy.py'
Dec 05 09:10:18 compute-1 sudo[172920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:18 compute-1 python3.9[172922]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925817.2936718-1965-148623399046476/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:18 compute-1 sudo[172920]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:19 compute-1 sudo[173072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqzawedthfiusbvbkfitwzgqwhxrvbfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925818.812543-2013-275961983111037/AnsiballZ_lineinfile.py'
Dec 05 09:10:19 compute-1 sudo[173072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:19 compute-1 python3.9[173074]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:19 compute-1 sudo[173072]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:19 compute-1 sudo[173224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khjdmmmqkhsaivhigxifqsazapffiwtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925819.5602186-2037-190883988662961/AnsiballZ_systemd.py'
Dec 05 09:10:19 compute-1 sudo[173224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:20 compute-1 python3.9[173226]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:10:20 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 05 09:10:20 compute-1 systemd[1]: Stopped Load Kernel Modules.
Dec 05 09:10:20 compute-1 systemd[1]: Stopping Load Kernel Modules...
Dec 05 09:10:20 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 05 09:10:20 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 05 09:10:20 compute-1 sudo[173224]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:21 compute-1 sudo[173380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lffhshqhnxfhuekxbdrjzlujeqocbsfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925820.7432823-2061-160574640877894/AnsiballZ_dnf.py'
Dec 05 09:10:21 compute-1 sudo[173380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:21 compute-1 python3.9[173382]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 09:10:23 compute-1 systemd[1]: Reloading.
Dec 05 09:10:23 compute-1 systemd-sysv-generator[173418]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:10:23 compute-1 systemd-rc-local-generator[173414]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:10:23 compute-1 systemd[1]: Reloading.
Dec 05 09:10:24 compute-1 systemd-rc-local-generator[173450]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:10:24 compute-1 systemd-sysv-generator[173454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:10:24 compute-1 systemd-logind[807]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 05 09:10:24 compute-1 systemd-logind[807]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 05 09:10:24 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 09:10:24 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 05 09:10:24 compute-1 systemd[1]: Reloading.
Dec 05 09:10:24 compute-1 systemd-rc-local-generator[173541]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:10:24 compute-1 systemd-sysv-generator[173547]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:10:24 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 09:10:25 compute-1 sudo[173380]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:26 compute-1 sudo[174834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuaxabjolooqlqantjxsuvaendkiggdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925825.7128875-2085-155122061732255/AnsiballZ_systemd_service.py'
Dec 05 09:10:26 compute-1 sudo[174834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:26 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 09:10:26 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 05 09:10:26 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.758s CPU time.
Dec 05 09:10:26 compute-1 systemd[1]: run-rb7973635e10042c0a9d893a35a62a754.service: Deactivated successfully.
Dec 05 09:10:26 compute-1 python3.9[174836]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:10:26 compute-1 systemd[1]: Stopping Open-iSCSI...
Dec 05 09:10:26 compute-1 iscsid[162624]: iscsid shutting down.
Dec 05 09:10:26 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Dec 05 09:10:26 compute-1 systemd[1]: Stopped Open-iSCSI.
Dec 05 09:10:26 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 05 09:10:26 compute-1 systemd[1]: Starting Open-iSCSI...
Dec 05 09:10:26 compute-1 systemd[1]: Started Open-iSCSI.
Dec 05 09:10:26 compute-1 sudo[174834]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:27 compute-1 python3.9[174991]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:10:28 compute-1 sudo[175145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jubucafscupznzcnmirkzybgypvzwfjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925827.8202815-2137-122020008802246/AnsiballZ_file.py'
Dec 05 09:10:28 compute-1 sudo[175145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:28 compute-1 python3.9[175147]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:28 compute-1 sudo[175145]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:29 compute-1 sudo[175297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mimkpowxizikpcpqxhhhhkbwaxeahxds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925829.0102127-2170-88121712667226/AnsiballZ_systemd_service.py'
Dec 05 09:10:29 compute-1 sudo[175297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:29 compute-1 python3.9[175299]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:10:29 compute-1 systemd[1]: Reloading.
Dec 05 09:10:29 compute-1 systemd-rc-local-generator[175324]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:10:29 compute-1 systemd-sysv-generator[175328]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:10:29 compute-1 sudo[175297]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:30 compute-1 python3.9[175484]: ansible-ansible.builtin.service_facts Invoked
Dec 05 09:10:30 compute-1 network[175501]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 09:10:30 compute-1 network[175502]: 'network-scripts' will be removed from distribution in near future.
Dec 05 09:10:30 compute-1 network[175503]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 09:10:33 compute-1 sshd-session[175509]: Received disconnect from 43.225.158.169 port 43533:11: Bye Bye [preauth]
Dec 05 09:10:33 compute-1 sshd-session[175509]: Disconnected from authenticating user root 43.225.158.169 port 43533 [preauth]
Dec 05 09:10:33 compute-1 sshd-session[175541]: Received disconnect from 122.114.113.177 port 50226:11: Bye Bye [preauth]
Dec 05 09:10:33 compute-1 sshd-session[175541]: Disconnected from authenticating user root 122.114.113.177 port 50226 [preauth]
Dec 05 09:10:35 compute-1 sudo[175779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crljmqysukwmehabhgqxunxlrezfwyko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925835.1469247-2227-184936722478013/AnsiballZ_systemd_service.py'
Dec 05 09:10:35 compute-1 sudo[175779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:35 compute-1 python3.9[175781]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:35 compute-1 sudo[175779]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:36 compute-1 sudo[175932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujtyfxnwqsbexhiavommgeprkpcchmpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925835.9848666-2227-171569944828944/AnsiballZ_systemd_service.py'
Dec 05 09:10:36 compute-1 sudo[175932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:36 compute-1 python3.9[175934]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:36 compute-1 sudo[175932]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:37 compute-1 sudo[176085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifidvyrgbmzfdltyeixlecsfdsosindb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925836.8549078-2227-233009954287334/AnsiballZ_systemd_service.py'
Dec 05 09:10:37 compute-1 sudo[176085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:37 compute-1 python3.9[176087]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:37 compute-1 sudo[176085]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:37 compute-1 sudo[176238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtoqwtjfhtbkolouwsuqsugnoyopicns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925837.6673186-2227-146897811451225/AnsiballZ_systemd_service.py'
Dec 05 09:10:37 compute-1 sudo[176238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:38 compute-1 python3.9[176240]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:38 compute-1 sudo[176238]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:38 compute-1 sudo[176391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlxgnnxmfsflbpqehgyzjyehklqucwat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925838.4571066-2227-77904601683695/AnsiballZ_systemd_service.py'
Dec 05 09:10:38 compute-1 sudo[176391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:39 compute-1 python3.9[176393]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:39 compute-1 sudo[176391]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:39 compute-1 sudo[176544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stfqfddkhndasgcqzjjdkageszpjhyru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925839.3605173-2227-2161139929543/AnsiballZ_systemd_service.py'
Dec 05 09:10:39 compute-1 sudo[176544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:39 compute-1 podman[176546]: 2025-12-05 09:10:39.989091949 +0000 UTC m=+0.107181941 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:10:40 compute-1 python3.9[176547]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:40 compute-1 sudo[176544]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:40 compute-1 sudo[176723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riyyujiqqogigynzchytlivstniipgwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925840.3278549-2227-86727324505754/AnsiballZ_systemd_service.py'
Dec 05 09:10:40 compute-1 sudo[176723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:40 compute-1 python3.9[176725]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:40 compute-1 sudo[176723]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:41 compute-1 sudo[176876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lodpsejsgtmvejurbottfayamqfgrhoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925841.1479423-2227-188965468945410/AnsiballZ_systemd_service.py'
Dec 05 09:10:41 compute-1 sudo[176876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:41 compute-1 python3.9[176878]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:10:41 compute-1 sudo[176876]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:42 compute-1 sudo[177029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orfihuegnugnmftmsdstbbdcohvghkeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925842.3400164-2404-36599386499356/AnsiballZ_file.py'
Dec 05 09:10:42 compute-1 sudo[177029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:42 compute-1 python3.9[177031]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:42 compute-1 sudo[177029]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:43 compute-1 sudo[177194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpyydedqqnellahtsowzcooubuivcefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925842.944034-2404-153801942383647/AnsiballZ_file.py'
Dec 05 09:10:43 compute-1 sudo[177194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:43 compute-1 podman[177155]: 2025-12-05 09:10:43.245516349 +0000 UTC m=+0.062651754 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 09:10:43 compute-1 python3.9[177200]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:43 compute-1 sudo[177194]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:43 compute-1 sudo[177352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yihsszisjzvedamjpduutjpnychzyhgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925843.5995018-2404-182471226805508/AnsiballZ_file.py'
Dec 05 09:10:43 compute-1 sudo[177352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:44 compute-1 python3.9[177354]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:44 compute-1 sudo[177352]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:44 compute-1 sudo[177515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkigqpeffdobcxwbvobfysdlmmtxfijh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925844.213401-2404-135995007196793/AnsiballZ_file.py'
Dec 05 09:10:44 compute-1 sudo[177515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:44 compute-1 podman[177478]: 2025-12-05 09:10:44.54748682 +0000 UTC m=+0.071676806 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:10:44 compute-1 python3.9[177525]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:44 compute-1 sudo[177515]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:45 compute-1 sudo[177676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kejkjmwxdhbrdgmqcbajnepvkbysguld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925844.8756804-2404-214636119404230/AnsiballZ_file.py'
Dec 05 09:10:45 compute-1 sudo[177676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:45 compute-1 python3.9[177678]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:45 compute-1 sudo[177676]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:45 compute-1 sudo[177828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcgkgwmdeghzkpcpzicnlbunqqsqcoor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925845.4954576-2404-99431846836897/AnsiballZ_file.py'
Dec 05 09:10:45 compute-1 sudo[177828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:45 compute-1 python3.9[177830]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:45 compute-1 sudo[177828]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:46 compute-1 sudo[177980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmivymkqexfxpmutqwujflxhzkjnzjqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925846.1091495-2404-204104571675969/AnsiballZ_file.py'
Dec 05 09:10:46 compute-1 sudo[177980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:46 compute-1 python3.9[177982]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:46 compute-1 sudo[177980]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:47 compute-1 sudo[178132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuugxjbjflixbevojghasrsetwlrnvvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925846.7247837-2404-108517086066542/AnsiballZ_file.py'
Dec 05 09:10:47 compute-1 sudo[178132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:47 compute-1 python3.9[178134]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:47 compute-1 sudo[178132]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:47 compute-1 sudo[178284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqmwghiwsqedzrayzufblyemgzdonotu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925847.4042451-2575-166549789454968/AnsiballZ_file.py'
Dec 05 09:10:47 compute-1 sudo[178284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:47 compute-1 python3.9[178286]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:47 compute-1 sudo[178284]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:48 compute-1 sudo[178436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wktsxrmixpsuatkgmswsutkvrqtpuixr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925848.061213-2575-241919291311636/AnsiballZ_file.py'
Dec 05 09:10:48 compute-1 sudo[178436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:48 compute-1 python3.9[178438]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:48 compute-1 sudo[178436]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:48 compute-1 sudo[178590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbzxwjmrbkvyipjlympvcxtfaylwvapa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925848.697645-2575-160992743580562/AnsiballZ_file.py'
Dec 05 09:10:48 compute-1 sudo[178590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:49 compute-1 python3.9[178592]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:49 compute-1 sudo[178590]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:49 compute-1 sshd-session[178439]: Received disconnect from 185.118.15.236 port 34798:11: Bye Bye [preauth]
Dec 05 09:10:49 compute-1 sshd-session[178439]: Disconnected from authenticating user root 185.118.15.236 port 34798 [preauth]
Dec 05 09:10:49 compute-1 sudo[178742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbklfetgxwanosneoathaflgvtitbgtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925849.30839-2575-17817799609599/AnsiballZ_file.py'
Dec 05 09:10:49 compute-1 sudo[178742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:49 compute-1 python3.9[178744]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:49 compute-1 sudo[178742]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:50 compute-1 sudo[178894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chsraxfdfdlsqueatoamvmmweygsrnrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925849.9638863-2575-28797536310620/AnsiballZ_file.py'
Dec 05 09:10:50 compute-1 sudo[178894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:50 compute-1 python3.9[178896]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:50 compute-1 sudo[178894]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:51 compute-1 sudo[179046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bypwkefnnbmzfnilkznytdmrahicupgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925850.9454467-2575-84664199701468/AnsiballZ_file.py'
Dec 05 09:10:51 compute-1 sudo[179046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:51 compute-1 python3.9[179048]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:51 compute-1 sudo[179046]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:51 compute-1 sudo[179198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmgjzclvjnhugpeofrbrglmsqsqfpucf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925851.60396-2575-135168013896636/AnsiballZ_file.py'
Dec 05 09:10:51 compute-1 sudo[179198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:52 compute-1 python3.9[179200]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:52 compute-1 sudo[179198]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:52 compute-1 sudo[179350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkrggtuaughpahbjqgmlgjakiczjkxzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925852.2794914-2575-99059399690612/AnsiballZ_file.py'
Dec 05 09:10:52 compute-1 sudo[179350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:52 compute-1 python3.9[179352]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:10:52 compute-1 sudo[179350]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:53 compute-1 sudo[179502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpdpkwdkyzzjxkfwgfcfmcboaifhjiia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925853.136145-2749-161095081516972/AnsiballZ_command.py'
Dec 05 09:10:53 compute-1 sudo[179502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:53 compute-1 python3.9[179504]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:10:53 compute-1 sudo[179502]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:54 compute-1 python3.9[179656]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 09:10:55 compute-1 sudo[179806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggcuqujplzzbazthxaikhepxawztraum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925854.8682342-2803-371273295547/AnsiballZ_systemd_service.py'
Dec 05 09:10:55 compute-1 sudo[179806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:55 compute-1 python3.9[179808]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:10:55 compute-1 systemd[1]: Reloading.
Dec 05 09:10:55 compute-1 systemd-rc-local-generator[179833]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:10:55 compute-1 systemd-sysv-generator[179836]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:10:55 compute-1 sudo[179806]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:56 compute-1 sudo[179993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyewuisskmmzyjdtcowtgiurrmvzacbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925856.099333-2827-99270388048976/AnsiballZ_command.py'
Dec 05 09:10:56 compute-1 sudo[179993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:56 compute-1 python3.9[179995]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:10:56 compute-1 sudo[179993]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:57 compute-1 sudo[180146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlpoysktgbmoiwxlkknjwofnuquampbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925856.743263-2827-133229467919265/AnsiballZ_command.py'
Dec 05 09:10:57 compute-1 sudo[180146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:57 compute-1 python3.9[180148]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:10:57 compute-1 sudo[180146]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:57 compute-1 sudo[180299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amvytfphufuzjtixtdxcpgcbizeqifck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925857.3996847-2827-174629770063627/AnsiballZ_command.py'
Dec 05 09:10:57 compute-1 sudo[180299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:57 compute-1 python3.9[180301]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:10:57 compute-1 sudo[180299]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:58 compute-1 sudo[180454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlqfwzuvcwjkfckxwfssikpoyasbxzbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925858.0926-2827-255038264069339/AnsiballZ_command.py'
Dec 05 09:10:58 compute-1 sudo[180454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:58 compute-1 python3.9[180456]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:10:58 compute-1 sudo[180454]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:59 compute-1 sudo[180607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jboiqbfigglnblfqlmsvhdfbjkwvoswf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925858.7711704-2827-109046894778339/AnsiballZ_command.py'
Dec 05 09:10:59 compute-1 sudo[180607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:59 compute-1 python3.9[180609]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:10:59 compute-1 sudo[180607]: pam_unix(sudo:session): session closed for user root
Dec 05 09:10:59 compute-1 sshd-session[180303]: Received disconnect from 122.168.194.41 port 48382:11: Bye Bye [preauth]
Dec 05 09:10:59 compute-1 sshd-session[180303]: Disconnected from authenticating user root 122.168.194.41 port 48382 [preauth]
Dec 05 09:10:59 compute-1 sudo[180760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkmdannfnkaophsrhsiajkplwqcpsggi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925859.4292676-2827-143434578713914/AnsiballZ_command.py'
Dec 05 09:10:59 compute-1 sudo[180760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:10:59 compute-1 python3.9[180762]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:10:59 compute-1 sudo[180760]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:00 compute-1 sudo[180913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miiaeqjopbebvrlyyidizrxjyiiamlmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925860.091282-2827-92027721018452/AnsiballZ_command.py'
Dec 05 09:11:00 compute-1 sudo[180913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:00 compute-1 python3.9[180915]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:11:00 compute-1 sudo[180913]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:00 compute-1 sudo[181066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wivcojsuafhixmgqbwvpmohducbshhqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925860.708691-2827-200481743060345/AnsiballZ_command.py'
Dec 05 09:11:00 compute-1 sudo[181066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:01 compute-1 python3.9[181068]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:11:01 compute-1 sudo[181066]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:02 compute-1 sudo[181219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaitsfxgkfgyumpthmbdfcqdmsckhrsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925862.5693343-3034-61329821936656/AnsiballZ_file.py'
Dec 05 09:11:02 compute-1 sudo[181219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:03 compute-1 python3.9[181221]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:03 compute-1 sudo[181219]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:03 compute-1 sudo[181371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecxygmdsymtzfesgjehxlnmzwargoxpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925863.2154696-3034-267140674397550/AnsiballZ_file.py'
Dec 05 09:11:03 compute-1 sudo[181371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:03 compute-1 python3.9[181373]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:03 compute-1 sudo[181371]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:04 compute-1 sudo[181523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxipiqezbcklmlvabyebpozphfanlayi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925863.8323987-3034-167938636624897/AnsiballZ_file.py'
Dec 05 09:11:04 compute-1 sudo[181523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:04 compute-1 python3.9[181525]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:04 compute-1 sudo[181523]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:04 compute-1 sudo[181675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzlaefzwdtvvtrrfywpkjsntjgohddud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925864.5773973-3100-129186182470516/AnsiballZ_file.py'
Dec 05 09:11:04 compute-1 sudo[181675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:05 compute-1 python3.9[181677]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:05 compute-1 sudo[181675]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:05 compute-1 sudo[181827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvosuwlvrvojyxkritmjgugrfsrxuldz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925865.2108448-3100-141123889902136/AnsiballZ_file.py'
Dec 05 09:11:05 compute-1 sudo[181827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:05 compute-1 python3.9[181829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:05 compute-1 sudo[181827]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:06 compute-1 sudo[181979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzgcbshhyhushhfenctjftdawfcnkkxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925865.8189692-3100-138077726153730/AnsiballZ_file.py'
Dec 05 09:11:06 compute-1 sudo[181979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:06 compute-1 python3.9[181981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:06 compute-1 sudo[181979]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:06 compute-1 sudo[182131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cenuxvrsjkwuydnwmzbwggihutnkqfre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925866.5155575-3100-53454760778752/AnsiballZ_file.py'
Dec 05 09:11:06 compute-1 sudo[182131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:07 compute-1 python3.9[182133]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:07 compute-1 sudo[182131]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:07 compute-1 sudo[182283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfwaxhzzzooatkcgnhdydmhwcmkzkxor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925867.182912-3100-221542435154378/AnsiballZ_file.py'
Dec 05 09:11:07 compute-1 sudo[182283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:07 compute-1 python3.9[182285]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:07 compute-1 sudo[182283]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:08 compute-1 sudo[182435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aivxjrljgswqvpaupycfktzalclvjhal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925867.8455389-3100-196203476665828/AnsiballZ_file.py'
Dec 05 09:11:08 compute-1 sudo[182435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:08 compute-1 python3.9[182437]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:08 compute-1 sudo[182435]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:08 compute-1 sudo[182587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ollbyajpzpuurfzsmgepolrajtlaxter ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925868.5248864-3100-254308081012130/AnsiballZ_file.py'
Dec 05 09:11:08 compute-1 sudo[182587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:11:08.853 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:11:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:11:08.857 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:11:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:11:08.858 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:11:09 compute-1 python3.9[182589]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:09 compute-1 sudo[182587]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:10 compute-1 podman[182614]: 2025-12-05 09:11:10.68915314 +0000 UTC m=+0.127406956 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:11:13 compute-1 podman[182662]: 2025-12-05 09:11:13.619858414 +0000 UTC m=+0.061432422 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:11:14 compute-1 sudo[182786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imcmwpocxcbppruqqcqkcpwvqbfpkbzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925873.5557525-3405-258891139692551/AnsiballZ_getent.py'
Dec 05 09:11:14 compute-1 sudo[182786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:14 compute-1 python3.9[182788]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 05 09:11:14 compute-1 sudo[182786]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:14 compute-1 sudo[182957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzuhwcdjqpggwlbgwfmfrlcxvxflubna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925874.4615917-3429-67388114129997/AnsiballZ_group.py'
Dec 05 09:11:14 compute-1 podman[182913]: 2025-12-05 09:11:14.946565365 +0000 UTC m=+0.062373515 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:11:14 compute-1 sudo[182957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:15 compute-1 python3.9[182962]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 09:11:15 compute-1 groupadd[182964]: group added to /etc/group: name=nova, GID=42436
Dec 05 09:11:15 compute-1 groupadd[182964]: group added to /etc/gshadow: name=nova
Dec 05 09:11:15 compute-1 groupadd[182964]: new group: name=nova, GID=42436
Dec 05 09:11:15 compute-1 sudo[182957]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:16 compute-1 sudo[183119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djjofbpigdocoonbqzmgpmodtxoeartz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925875.623457-3453-269038232565138/AnsiballZ_user.py'
Dec 05 09:11:16 compute-1 sudo[183119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:16 compute-1 python3.9[183121]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 09:11:16 compute-1 useradd[183123]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 05 09:11:16 compute-1 useradd[183123]: add 'nova' to group 'libvirt'
Dec 05 09:11:16 compute-1 useradd[183123]: add 'nova' to shadow group 'libvirt'
Dec 05 09:11:16 compute-1 sudo[183119]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:17 compute-1 sshd-session[183154]: Accepted publickey for zuul from 192.168.122.30 port 53742 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:11:17 compute-1 systemd-logind[807]: New session 26 of user zuul.
Dec 05 09:11:17 compute-1 systemd[1]: Started Session 26 of User zuul.
Dec 05 09:11:17 compute-1 sshd-session[183154]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:11:17 compute-1 sshd-session[183157]: Received disconnect from 192.168.122.30 port 53742:11: disconnected by user
Dec 05 09:11:17 compute-1 sshd-session[183157]: Disconnected from user zuul 192.168.122.30 port 53742
Dec 05 09:11:17 compute-1 sshd-session[183154]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:11:17 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Dec 05 09:11:17 compute-1 systemd-logind[807]: Session 26 logged out. Waiting for processes to exit.
Dec 05 09:11:17 compute-1 systemd-logind[807]: Removed session 26.
Dec 05 09:11:18 compute-1 python3.9[183307]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:19 compute-1 python3.9[183428]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925877.911703-3528-6765472154405/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:19 compute-1 python3.9[183578]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:20 compute-1 python3.9[183654]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:20 compute-1 python3.9[183804]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:21 compute-1 python3.9[183925]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925880.3141208-3528-247353474067080/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:21 compute-1 python3.9[184075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:22 compute-1 python3.9[184196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925881.5169566-3528-82906078225670/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:23 compute-1 python3.9[184346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:23 compute-1 python3.9[184467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925882.674967-3528-111008329950248/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:24 compute-1 python3.9[184617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:24 compute-1 python3.9[184738]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925883.904211-3528-150037882695664/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:25 compute-1 sudo[184888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipnjsviinuceophqyfvkihqgfekoatba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925885.218909-3777-277757942235430/AnsiballZ_file.py'
Dec 05 09:11:25 compute-1 sudo[184888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:25 compute-1 python3.9[184890]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:11:25 compute-1 sudo[184888]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:26 compute-1 sudo[185040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flwijcunudiqqjhzdfhmdjxxzrinzmib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925885.961188-3801-267498350488531/AnsiballZ_copy.py'
Dec 05 09:11:26 compute-1 sudo[185040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:26 compute-1 python3.9[185042]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:11:26 compute-1 sudo[185040]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:26 compute-1 sudo[185192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyibdlppdfooplvzxvfjanbhgvywjyxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925886.6905363-3825-84810791037972/AnsiballZ_stat.py'
Dec 05 09:11:26 compute-1 sudo[185192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:27 compute-1 python3.9[185194]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:11:27 compute-1 sudo[185192]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:27 compute-1 sudo[185344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qseaklstieecfnljofehlkotokwgdrlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925887.4014182-3849-86841356014269/AnsiballZ_stat.py'
Dec 05 09:11:27 compute-1 sudo[185344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:27 compute-1 python3.9[185346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:27 compute-1 sudo[185344]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:28 compute-1 sudo[185467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vacfmyjtaeioibeagyqpsylmsrqdtsoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925887.4014182-3849-86841356014269/AnsiballZ_copy.py'
Dec 05 09:11:28 compute-1 sudo[185467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:28 compute-1 python3.9[185469]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764925887.4014182-3849-86841356014269/.source _original_basename=.qqtvxyp5 follow=False checksum=5073cf13dd1fcf9608fd3439275a3ff80288717e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 05 09:11:28 compute-1 sudo[185467]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:29 compute-1 python3.9[185621]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:11:30 compute-1 python3.9[185773]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:31 compute-1 python3.9[185894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925889.7042153-3927-273945155479048/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:31 compute-1 python3.9[186044]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:11:32 compute-1 python3.9[186165]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925891.2451038-3973-257025323333110/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:11:33 compute-1 sudo[186315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orkchfyexgjglugqxhdxswcaakdxcpsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925892.78655-4024-180466490596647/AnsiballZ_container_config_data.py'
Dec 05 09:11:33 compute-1 sudo[186315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:33 compute-1 python3.9[186317]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 05 09:11:33 compute-1 sudo[186315]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:34 compute-1 sudo[186467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dladwvbpdqhhmhdnvjliobbzvqmeusfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925893.8228576-4056-235506801699075/AnsiballZ_container_config_hash.py'
Dec 05 09:11:34 compute-1 sudo[186467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:34 compute-1 python3.9[186469]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:11:34 compute-1 sudo[186467]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:35 compute-1 sudo[186619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbmhqatyadvkjssesrhrqpkspyhkgueo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925895.2192328-4086-250263160688451/AnsiballZ_edpm_container_manage.py'
Dec 05 09:11:35 compute-1 sudo[186619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:35 compute-1 python3[186621]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:11:36 compute-1 podman[186658]: 2025-12-05 09:11:36.036730894 +0000 UTC m=+0.057562006 container create 59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 09:11:36 compute-1 podman[186658]: 2025-12-05 09:11:36.007516856 +0000 UTC m=+0.028347988 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 05 09:11:36 compute-1 python3[186621]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 05 09:11:36 compute-1 sudo[186619]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:36 compute-1 sudo[186845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpkjwbocyhmomhdupztkdsrmhwcnsetu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925896.3939896-4110-5318803820674/AnsiballZ_stat.py'
Dec 05 09:11:36 compute-1 sudo[186845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:36 compute-1 python3.9[186847]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:11:36 compute-1 sudo[186845]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:37 compute-1 sudo[186999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecvbuvuzhasxqhpejqgjgwutbuznqbzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925897.6542945-4146-10774615088530/AnsiballZ_container_config_data.py'
Dec 05 09:11:37 compute-1 sudo[186999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:38 compute-1 python3.9[187001]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 05 09:11:38 compute-1 sudo[186999]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:39 compute-1 sudo[187151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfqznjdziwhivzcmfklnmajxtzgniyoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925898.7435436-4179-244902386746927/AnsiballZ_container_config_hash.py'
Dec 05 09:11:39 compute-1 sudo[187151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:39 compute-1 python3.9[187153]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:11:39 compute-1 sudo[187151]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:40 compute-1 sudo[187303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkljqiredhiqrghllnkctdjibopejsfm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925899.7604296-4209-273886646020322/AnsiballZ_edpm_container_manage.py'
Dec 05 09:11:40 compute-1 sudo[187303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:40 compute-1 python3[187305]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:11:40 compute-1 podman[187338]: 2025-12-05 09:11:40.561153961 +0000 UTC m=+0.050125723 container create 0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 05 09:11:40 compute-1 podman[187338]: 2025-12-05 09:11:40.536111176 +0000 UTC m=+0.025082968 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 05 09:11:40 compute-1 python3[187305]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 05 09:11:40 compute-1 sudo[187303]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:41 compute-1 sudo[187537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cieiglxxgqygxvtulmrtepirnyetimhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925900.996924-4233-29440713788812/AnsiballZ_stat.py'
Dec 05 09:11:41 compute-1 sudo[187537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:41 compute-1 podman[187500]: 2025-12-05 09:11:41.378637952 +0000 UTC m=+0.111837511 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 09:11:41 compute-1 python3.9[187545]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:11:41 compute-1 sudo[187537]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:42 compute-1 sudo[187706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypefrkzdbhtvmapuaodjwfytvqsistgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925901.9659913-4260-260200472958149/AnsiballZ_file.py'
Dec 05 09:11:42 compute-1 sudo[187706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:42 compute-1 python3.9[187708]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:11:42 compute-1 sudo[187706]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:42 compute-1 sudo[187859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwifgnsmredimstdfpoehdrhkdqrjglg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925902.5726864-4260-70718138303585/AnsiballZ_copy.py'
Dec 05 09:11:42 compute-1 sudo[187859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:43 compute-1 python3.9[187861]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764925902.5726864-4260-70718138303585/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:11:43 compute-1 sudo[187859]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:43 compute-1 sudo[187935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldbelemfsxtuiuldfqshriswmuzxaiek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925902.5726864-4260-70718138303585/AnsiballZ_systemd.py'
Dec 05 09:11:43 compute-1 sudo[187935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:43 compute-1 sshd-session[187709]: Received disconnect from 43.225.158.169 port 56673:11: Bye Bye [preauth]
Dec 05 09:11:43 compute-1 sshd-session[187709]: Disconnected from authenticating user root 43.225.158.169 port 56673 [preauth]
Dec 05 09:11:43 compute-1 python3.9[187937]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:11:43 compute-1 systemd[1]: Reloading.
Dec 05 09:11:43 compute-1 systemd-sysv-generator[187986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:11:43 compute-1 systemd-rc-local-generator[187983]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:11:43 compute-1 podman[187939]: 2025-12-05 09:11:43.935708316 +0000 UTC m=+0.086927179 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 09:11:44 compute-1 sudo[187935]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:44 compute-1 sudo[188065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgiiahecvjcignqqwvxanfaucsbxwmbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925902.5726864-4260-70718138303585/AnsiballZ_systemd.py'
Dec 05 09:11:44 compute-1 sudo[188065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:45 compute-1 python3.9[188067]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:11:45 compute-1 systemd[1]: Reloading.
Dec 05 09:11:45 compute-1 podman[188069]: 2025-12-05 09:11:45.203395486 +0000 UTC m=+0.074838911 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 09:11:45 compute-1 systemd-rc-local-generator[188113]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:11:45 compute-1 systemd-sysv-generator[188116]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:11:45 compute-1 systemd[1]: Starting nova_compute container...
Dec 05 09:11:45 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:11:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:45 compute-1 podman[188128]: 2025-12-05 09:11:45.59073158 +0000 UTC m=+0.108897339 container init 0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3)
Dec 05 09:11:45 compute-1 podman[188128]: 2025-12-05 09:11:45.597568597 +0000 UTC m=+0.115734336 container start 0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:11:45 compute-1 podman[188128]: nova_compute
Dec 05 09:11:45 compute-1 nova_compute[188144]: + sudo -E kolla_set_configs
Dec 05 09:11:45 compute-1 systemd[1]: Started nova_compute container.
Dec 05 09:11:45 compute-1 sudo[188065]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Validating config file
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying service configuration files
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Deleting /etc/ceph
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Creating directory /etc/ceph
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /etc/ceph
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Writing out command to execute
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 09:11:45 compute-1 nova_compute[188144]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 09:11:45 compute-1 nova_compute[188144]: ++ cat /run_command
Dec 05 09:11:45 compute-1 nova_compute[188144]: + CMD=nova-compute
Dec 05 09:11:45 compute-1 nova_compute[188144]: + ARGS=
Dec 05 09:11:45 compute-1 nova_compute[188144]: + sudo kolla_copy_cacerts
Dec 05 09:11:45 compute-1 nova_compute[188144]: + [[ ! -n '' ]]
Dec 05 09:11:45 compute-1 nova_compute[188144]: + . kolla_extend_start
Dec 05 09:11:45 compute-1 nova_compute[188144]: Running command: 'nova-compute'
Dec 05 09:11:45 compute-1 nova_compute[188144]: + echo 'Running command: '\''nova-compute'\'''
Dec 05 09:11:45 compute-1 nova_compute[188144]: + umask 0022
Dec 05 09:11:45 compute-1 nova_compute[188144]: + exec nova-compute
Dec 05 09:11:47 compute-1 python3.9[188306]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:11:48 compute-1 python3.9[188456]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:11:48 compute-1 nova_compute[188144]: 2025-12-05 09:11:48.363 188148 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 09:11:48 compute-1 nova_compute[188144]: 2025-12-05 09:11:48.364 188148 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 09:11:48 compute-1 nova_compute[188144]: 2025-12-05 09:11:48.364 188148 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 09:11:48 compute-1 nova_compute[188144]: 2025-12-05 09:11:48.364 188148 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 05 09:11:48 compute-1 nova_compute[188144]: 2025-12-05 09:11:48.551 188148 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:11:48 compute-1 nova_compute[188144]: 2025-12-05 09:11:48.578 188148 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:11:48 compute-1 nova_compute[188144]: 2025-12-05 09:11:48.578 188148 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 05 09:11:49 compute-1 python3.9[188610]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:11:49 compute-1 nova_compute[188144]: 2025-12-05 09:11:49.820 188148 INFO nova.virt.driver [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.008 188148 INFO nova.compute.provider_config [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.086 188148 DEBUG oslo_concurrency.lockutils [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.087 188148 DEBUG oslo_concurrency.lockutils [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.087 188148 DEBUG oslo_concurrency.lockutils [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.088 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.088 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.088 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.088 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.088 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.089 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.089 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.089 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.089 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.089 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.090 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.090 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.090 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.090 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.091 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.091 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.091 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.091 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.091 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.092 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.092 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.092 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.092 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.092 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.093 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.093 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.093 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.093 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.094 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.094 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.094 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.094 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.094 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.094 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.094 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.095 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.095 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.095 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.095 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.095 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.096 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.096 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.096 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.096 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.096 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.097 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.097 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.097 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.097 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.097 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.098 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.098 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.098 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.098 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.098 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.098 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.099 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.099 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.099 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.099 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.099 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.100 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.100 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.100 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.100 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.100 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.100 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.100 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.101 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.101 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.101 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.101 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.101 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.102 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.102 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.102 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.102 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.102 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.102 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.103 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.103 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.103 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.103 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.103 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.104 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.104 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.104 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.104 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.104 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.104 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.105 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.105 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.105 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.105 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.105 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.105 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.105 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.106 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.106 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.106 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.106 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.106 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.106 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.106 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.107 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.107 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.107 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.107 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.107 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.107 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.107 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.108 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.108 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.108 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.108 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.108 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.108 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.108 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.109 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.109 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.109 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.109 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.109 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.109 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.110 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.110 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.110 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.110 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.110 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.110 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.111 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.111 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.111 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.111 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.111 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.111 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.111 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.112 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.112 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.112 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.112 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.112 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.113 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.113 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.113 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.113 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.113 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.113 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.113 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.114 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.114 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.114 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.114 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.114 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.114 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.114 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.115 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.115 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.115 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.115 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.115 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.115 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.115 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.116 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.116 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.116 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.116 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.116 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.117 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.117 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.117 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.117 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.117 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.118 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.118 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.118 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.118 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.118 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.119 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.119 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.119 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.119 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.120 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.120 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.120 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.120 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.120 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.121 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.121 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.121 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.121 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.122 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.122 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.122 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.122 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.122 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.123 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.123 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.123 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.123 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.124 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.124 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.124 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.124 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.124 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.125 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.125 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.125 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.125 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.125 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.126 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.126 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.126 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.126 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.127 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.127 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.127 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.127 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.127 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.128 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.128 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.128 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.128 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.129 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.129 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.129 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.129 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.129 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.130 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.130 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.130 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.130 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.130 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.131 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.131 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.131 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.131 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.132 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.132 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.132 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.132 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.133 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.133 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.133 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.133 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.133 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.134 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.134 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.134 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.134 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.134 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.135 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.135 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.135 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.135 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.135 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.136 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.136 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.136 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.136 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.136 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.137 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.137 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.137 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.137 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.137 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.138 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.138 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.138 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.138 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.139 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.139 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.139 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.139 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.139 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.140 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.140 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.140 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.140 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.140 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.140 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.141 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.141 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.141 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.141 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.141 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.142 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.142 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.142 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.142 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.142 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.143 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.143 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.143 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.143 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.143 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.143 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.143 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.144 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.144 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.144 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.144 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.144 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.144 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.145 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.145 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.145 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.145 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.145 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.145 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.146 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.146 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.146 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.146 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.146 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.146 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.147 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.147 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.147 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.147 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.147 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.147 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.148 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.148 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.148 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.148 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.148 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.148 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.149 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.149 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.149 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.149 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.149 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.150 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.150 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.150 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.150 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.150 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.150 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.151 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.151 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.151 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.151 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.151 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.152 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.152 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.152 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.152 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.152 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.153 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.153 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.153 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.153 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.153 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.154 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.154 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.154 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.155 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.155 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.155 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.155 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.155 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.156 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.156 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.156 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.156 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.156 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.157 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.157 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.157 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.157 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.157 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.157 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.158 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.158 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.158 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.158 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.158 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.158 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.158 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.159 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.159 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.159 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.159 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.159 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.159 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.160 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.160 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.160 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.160 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.160 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.160 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.161 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.161 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.161 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.161 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.161 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.161 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.162 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.162 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.162 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.162 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.162 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.162 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.162 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.163 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.163 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.163 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.163 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.163 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.163 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.163 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.164 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.164 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.164 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.164 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.164 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.164 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.164 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.165 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.165 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.165 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.165 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.165 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.165 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.165 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.166 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.166 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.166 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.166 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.166 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.166 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.167 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.167 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.167 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.167 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.167 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.167 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.167 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.168 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.168 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.168 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.168 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.168 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.168 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.169 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.169 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.169 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.169 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.169 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.169 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.169 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.170 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.170 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.170 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.170 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.170 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.170 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.170 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.171 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.171 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.171 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.171 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.171 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.171 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.172 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.172 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.172 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.172 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.172 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.172 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.172 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.173 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.173 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.173 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.173 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.173 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.173 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.173 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.174 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.174 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.174 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.174 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.174 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.175 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.175 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.175 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.175 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.175 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.175 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.176 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.176 188148 WARNING oslo_config.cfg [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 05 09:11:50 compute-1 nova_compute[188144]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 05 09:11:50 compute-1 nova_compute[188144]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 05 09:11:50 compute-1 nova_compute[188144]: and ``live_migration_inbound_addr`` respectively.
Dec 05 09:11:50 compute-1 nova_compute[188144]: ).  Its value may be silently ignored in the future.
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.176 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.176 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.176 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.176 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.178 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.178 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.178 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.179 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.179 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.179 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.179 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.179 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.179 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.179 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.180 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.180 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.180 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.180 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.180 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.180 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.180 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.181 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.181 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.181 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.181 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.181 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.181 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.181 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.182 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.182 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.182 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.182 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.182 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.182 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.183 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.183 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.183 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.183 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.183 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.183 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.183 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.184 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.184 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.184 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.184 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.184 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.184 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.184 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.185 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.185 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.185 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.185 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.185 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.185 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.186 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.186 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.186 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.186 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.186 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.186 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.186 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.187 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.187 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.187 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.187 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.187 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.187 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.188 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.188 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.188 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.188 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.189 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.189 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.189 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.189 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.189 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.189 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.190 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.190 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.190 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.190 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.190 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.190 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.190 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.191 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.191 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.191 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.191 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.191 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.191 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.192 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.192 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.192 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.192 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.192 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.192 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.192 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.193 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.193 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.193 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.193 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.193 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.193 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.193 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.194 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.194 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.194 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.194 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.194 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.194 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.195 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.195 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.195 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.195 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.195 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.195 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.195 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.196 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.196 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.196 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.196 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.196 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.196 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.196 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.197 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.197 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.197 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.197 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.197 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.197 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.198 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.198 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.198 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.198 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.198 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.198 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.198 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.199 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.199 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.199 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.199 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.200 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.200 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.200 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.200 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.200 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.200 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.201 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.201 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.201 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.201 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.201 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.201 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.202 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.202 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.202 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.202 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.202 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.202 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.203 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.203 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.203 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.203 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.203 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.204 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.204 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.204 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.204 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.204 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.205 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.205 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.205 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.205 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.205 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.205 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.206 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.206 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.206 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.206 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.207 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.207 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.207 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.207 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.207 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.208 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.208 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.208 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.208 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.208 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.209 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.209 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.209 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.209 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.209 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.210 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.210 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.210 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.210 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.210 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.211 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.211 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.211 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.211 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.211 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.211 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.212 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.212 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.212 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.212 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.212 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.213 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.213 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.213 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.213 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.213 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.213 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.213 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.214 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.214 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.214 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.214 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.214 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.214 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.215 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.215 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.215 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.215 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.215 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.215 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.216 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.216 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.216 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.216 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.216 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.216 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.216 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.217 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.217 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.217 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.217 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.217 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.217 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.218 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.218 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.218 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.218 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.218 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.218 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.219 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.219 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.219 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.219 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.219 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.220 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.220 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.220 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.220 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.220 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.220 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.221 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.221 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.221 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.221 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.221 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.222 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.222 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.222 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.222 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.222 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.223 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.223 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.223 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.223 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.224 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.224 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.224 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.224 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.224 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.225 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.225 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.225 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.225 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.225 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.225 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.225 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.226 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.226 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.226 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.226 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.226 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.227 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.227 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.227 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.227 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.227 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.227 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.227 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.228 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.228 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.228 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.228 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.228 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.228 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.229 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.229 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.229 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.229 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.229 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.230 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.230 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.230 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.230 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.230 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.230 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.231 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.231 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.231 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.231 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.231 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.231 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.231 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.232 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.232 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.232 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.232 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.232 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.232 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.233 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.233 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.233 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.233 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.233 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.234 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.234 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.234 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.234 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.234 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.235 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.235 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.235 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.235 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.235 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.235 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.236 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.236 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.236 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.236 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.236 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.236 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.237 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.237 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.237 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.237 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.237 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.237 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.238 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.238 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.238 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.238 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.238 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.238 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.238 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.239 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.239 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.239 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.239 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.239 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.239 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.239 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.240 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.240 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.240 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.240 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.240 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.240 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.241 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.241 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.241 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.241 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.241 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.242 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.242 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.242 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.242 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.242 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.243 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.243 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.243 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.243 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.243 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.244 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.244 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.244 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.244 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.245 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.245 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.245 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.245 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.246 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.246 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.246 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.246 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.247 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.247 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.248 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.248 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.249 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.249 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.249 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.250 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.250 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.250 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.250 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.250 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.250 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.250 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.251 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.251 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.251 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.251 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.251 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.251 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.252 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.252 188148 DEBUG oslo_service.service [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.253 188148 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.920 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.921 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.921 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 05 09:11:50 compute-1 nova_compute[188144]: 2025-12-05 09:11:50.922 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 05 09:11:50 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Dec 05 09:11:50 compute-1 systemd[1]: Started libvirt QEMU daemon.
Dec 05 09:11:51 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.030 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2ace2acc10> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 05 09:11:51 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.035 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2ace2acc10> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 05 09:11:51 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.036 188148 INFO nova.virt.libvirt.driver [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Connection event '1' reason 'None'
Dec 05 09:11:51 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.060 188148 WARNING nova.virt.libvirt.driver [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 05 09:11:51 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.061 188148 DEBUG nova.virt.libvirt.volume.mount [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 05 09:11:51 compute-1 sudo[188804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hneldagvdjdsiuuchtdxbbggrokhrsvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925910.1087394-4440-138511682504158/AnsiballZ_podman_container.py'
Dec 05 09:11:51 compute-1 sudo[188804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:51 compute-1 python3.9[188807]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 05 09:11:51 compute-1 sudo[188804]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:51 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:11:51 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:11:51 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.926 188148 INFO nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Libvirt host capabilities <capabilities>
Dec 05 09:11:51 compute-1 nova_compute[188144]: 
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <host>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <uuid>dc540e4b-dbcd-41cc-9101-8457f90414ef</uuid>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <cpu>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <arch>x86_64</arch>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model>EPYC-Rome-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <vendor>AMD</vendor>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <microcode version='16777317'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <signature family='23' model='49' stepping='0'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='x2apic'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='tsc-deadline'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='osxsave'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='hypervisor'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='tsc_adjust'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='spec-ctrl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='stibp'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='arch-capabilities'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='ssbd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='cmp_legacy'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='topoext'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='virt-ssbd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='lbrv'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='tsc-scale'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='vmcb-clean'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='pause-filter'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='pfthreshold'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='svme-addr-chk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='rdctl-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='skip-l1dfl-vmentry'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='mds-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature name='pschange-mc-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <pages unit='KiB' size='4'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <pages unit='KiB' size='2048'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <pages unit='KiB' size='1048576'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </cpu>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <power_management>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <suspend_mem/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <suspend_disk/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <suspend_hybrid/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </power_management>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <iommu support='no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <migration_features>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <live/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <uri_transports>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <uri_transport>tcp</uri_transport>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <uri_transport>rdma</uri_transport>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </uri_transports>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </migration_features>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <topology>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <cells num='1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <cell id='0'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:           <memory unit='KiB'>7864304</memory>
Dec 05 09:11:51 compute-1 nova_compute[188144]:           <pages unit='KiB' size='4'>1966076</pages>
Dec 05 09:11:51 compute-1 nova_compute[188144]:           <pages unit='KiB' size='2048'>0</pages>
Dec 05 09:11:51 compute-1 nova_compute[188144]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 05 09:11:51 compute-1 nova_compute[188144]:           <distances>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <sibling id='0' value='10'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:           </distances>
Dec 05 09:11:51 compute-1 nova_compute[188144]:           <cpus num='8'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:           </cpus>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         </cell>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </cells>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </topology>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <cache>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </cache>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <secmodel>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model>selinux</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <doi>0</doi>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </secmodel>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <secmodel>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model>dac</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <doi>0</doi>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </secmodel>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   </host>
Dec 05 09:11:51 compute-1 nova_compute[188144]: 
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <guest>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <os_type>hvm</os_type>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <arch name='i686'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <wordsize>32</wordsize>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <domain type='qemu'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <domain type='kvm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </arch>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <features>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <pae/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <nonpae/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <acpi default='on' toggle='yes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <apic default='on' toggle='no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <cpuselection/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <deviceboot/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <disksnapshot default='on' toggle='no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <externalSnapshot/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </features>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   </guest>
Dec 05 09:11:51 compute-1 nova_compute[188144]: 
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <guest>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <os_type>hvm</os_type>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <arch name='x86_64'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <wordsize>64</wordsize>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <domain type='qemu'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <domain type='kvm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </arch>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <features>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <acpi default='on' toggle='yes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <apic default='on' toggle='no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <cpuselection/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <deviceboot/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <disksnapshot default='on' toggle='no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <externalSnapshot/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </features>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   </guest>
Dec 05 09:11:51 compute-1 nova_compute[188144]: 
Dec 05 09:11:51 compute-1 nova_compute[188144]: </capabilities>
Dec 05 09:11:51 compute-1 nova_compute[188144]: 
Dec 05 09:11:51 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.935 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 05 09:11:51 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.957 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 05 09:11:51 compute-1 nova_compute[188144]: <domainCapabilities>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <domain>kvm</domain>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <arch>i686</arch>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <vcpu max='240'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <iothreads supported='yes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <os supported='yes'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <enum name='firmware'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <loader supported='yes'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>rom</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>pflash</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <enum name='readonly'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>yes</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>no</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <enum name='secure'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>no</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </loader>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   </os>
Dec 05 09:11:51 compute-1 nova_compute[188144]:   <cpu>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <mode name='host-passthrough' supported='yes'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <enum name='hostPassthroughMigratable'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>on</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>off</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <mode name='maximum' supported='yes'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <enum name='maximumMigratable'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>on</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <value>off</value>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <mode name='host-model' supported='yes'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <vendor>AMD</vendor>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='x2apic'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='hypervisor'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='stibp'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='ssbd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='overflow-recov'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='succor'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='ibrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='lbrv'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc-scale'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='flushbyasid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='pause-filter'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='pfthreshold'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <feature policy='disable' name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:51 compute-1 nova_compute[188144]:     <mode name='custom' supported='yes'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Broadwell'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Broadwell-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Broadwell-noTSX'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v4'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cooperlake'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cooperlake-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Cooperlake-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Denverton'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Denverton-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Denverton-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Denverton-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Dhyana-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Genoa'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='auto-ibrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='auto-ibrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='EPYC-v4'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx10'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx10-128'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx10-256'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx10-512'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Haswell'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Haswell-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Haswell-noTSX'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Haswell-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Haswell-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Haswell-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Haswell-v4'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v4'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v5'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v6'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v7'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='IvyBridge'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='KnightsMill'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512er'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512pf'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='KnightsMill-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512er'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512pf'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Opteron_G4'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Opteron_G4-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Opteron_G5'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tbm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Opteron_G5-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tbm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='SierraForest'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='cmpccxadd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='SierraForest-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-ifma'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='cmpccxadd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v4'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v1'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v2'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v3'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 09:11:51 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v4'>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:51 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='athlon'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='athlon-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='core2duo'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='core2duo-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='coreduo'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='coreduo-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='n270'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='n270-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='phenom'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='phenom-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </cpu>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <memoryBacking supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <enum name='sourceType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>file</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>anonymous</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>memfd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </memoryBacking>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <devices>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <disk supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='diskDevice'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>disk</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>cdrom</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>floppy</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>lun</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='bus'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ide</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>fdc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>scsi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>sata</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-non-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </disk>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <graphics supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vnc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>egl-headless</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dbus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </graphics>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <video supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='modelType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vga</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>cirrus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>none</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>bochs</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ramfb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </video>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <hostdev supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='mode'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>subsystem</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='startupPolicy'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>default</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>mandatory</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>requisite</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>optional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='subsysType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pci</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>scsi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='capsType'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='pciBackend'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </hostdev>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <rng supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-non-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>random</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>egd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>builtin</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </rng>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <filesystem supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='driverType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>path</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>handle</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtiofs</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </filesystem>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <tpm supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tpm-tis</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tpm-crb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>emulator</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>external</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendVersion'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>2.0</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </tpm>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <redirdev supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='bus'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </redirdev>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <channel supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pty</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>unix</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </channel>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <crypto supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>qemu</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>builtin</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </crypto>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <interface supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>default</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>passt</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </interface>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <panic supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>isa</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>hyperv</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </panic>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <console supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>null</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pty</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dev</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>file</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pipe</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>stdio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>udp</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tcp</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>unix</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>qemu-vdagent</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dbus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </console>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </devices>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <features>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <gic supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <vmcoreinfo supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <genid supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <backingStoreInput supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <backup supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <async-teardown supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <ps2 supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <sev supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <sgx supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <hyperv supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='features'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>relaxed</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vapic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>spinlocks</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vpindex</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>runtime</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>synic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>stimer</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>reset</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vendor_id</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>frequencies</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>reenlightenment</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tlbflush</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ipi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>avic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>emsr_bitmap</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>xmm_input</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <defaults>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <spinlocks>4095</spinlocks>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <stimer_direct>on</stimer_direct>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </defaults>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </hyperv>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <launchSecurity supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='sectype'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tdx</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </launchSecurity>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </features>
Dec 05 09:11:52 compute-1 nova_compute[188144]: </domainCapabilities>
Dec 05 09:11:52 compute-1 nova_compute[188144]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:51.965 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 05 09:11:52 compute-1 nova_compute[188144]: <domainCapabilities>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <domain>kvm</domain>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <arch>i686</arch>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <vcpu max='4096'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <iothreads supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <os supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <enum name='firmware'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <loader supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>rom</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pflash</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='readonly'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>yes</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>no</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='secure'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>no</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </loader>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </os>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <cpu>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='host-passthrough' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='hostPassthroughMigratable'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>on</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>off</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='maximum' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='maximumMigratable'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>on</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>off</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='host-model' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <vendor>AMD</vendor>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='x2apic'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='hypervisor'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='stibp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='overflow-recov'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='succor'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='lbrv'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc-scale'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='flushbyasid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='pause-filter'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='pfthreshold'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='disable' name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='custom' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Dhyana-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Genoa'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='auto-ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='auto-ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-128'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-256'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-512'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v6'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v7'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='KnightsMill'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512er'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512pf'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='KnightsMill-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512er'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512pf'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G4-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tbm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G5-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tbm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SierraForest'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cmpccxadd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SierraForest-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cmpccxadd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='athlon'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='athlon-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='core2duo'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='core2duo-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='coreduo'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='coreduo-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='n270'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='n270-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='phenom'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='phenom-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </cpu>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <memoryBacking supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <enum name='sourceType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>file</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>anonymous</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>memfd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </memoryBacking>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <devices>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <disk supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='diskDevice'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>disk</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>cdrom</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>floppy</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>lun</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='bus'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>fdc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>scsi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>sata</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-non-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </disk>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <graphics supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vnc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>egl-headless</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dbus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </graphics>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <video supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='modelType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vga</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>cirrus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>none</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>bochs</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ramfb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </video>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <hostdev supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='mode'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>subsystem</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='startupPolicy'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>default</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>mandatory</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>requisite</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>optional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='subsysType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pci</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>scsi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='capsType'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='pciBackend'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </hostdev>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <rng supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-non-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>random</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>egd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>builtin</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </rng>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <filesystem supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='driverType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>path</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>handle</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtiofs</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </filesystem>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <tpm supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tpm-tis</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tpm-crb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>emulator</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>external</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendVersion'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>2.0</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </tpm>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <redirdev supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='bus'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </redirdev>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <channel supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pty</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>unix</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </channel>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <crypto supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>qemu</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>builtin</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </crypto>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <interface supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>default</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>passt</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </interface>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <panic supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>isa</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>hyperv</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </panic>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <console supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>null</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pty</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dev</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>file</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pipe</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>stdio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>udp</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tcp</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>unix</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>qemu-vdagent</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dbus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </console>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </devices>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <features>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <gic supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <vmcoreinfo supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <genid supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <backingStoreInput supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <backup supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <async-teardown supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <ps2 supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <sev supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <sgx supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <hyperv supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='features'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>relaxed</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vapic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>spinlocks</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vpindex</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>runtime</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>synic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>stimer</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>reset</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vendor_id</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>frequencies</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>reenlightenment</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tlbflush</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ipi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>avic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>emsr_bitmap</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>xmm_input</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <defaults>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <spinlocks>4095</spinlocks>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <stimer_direct>on</stimer_direct>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </defaults>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </hyperv>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <launchSecurity supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='sectype'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tdx</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </launchSecurity>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </features>
Dec 05 09:11:52 compute-1 nova_compute[188144]: </domainCapabilities>
Dec 05 09:11:52 compute-1 nova_compute[188144]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.003 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.007 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 05 09:11:52 compute-1 nova_compute[188144]: <domainCapabilities>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <domain>kvm</domain>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <arch>x86_64</arch>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <vcpu max='240'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <iothreads supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <os supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <enum name='firmware'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <loader supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>rom</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pflash</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='readonly'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>yes</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>no</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='secure'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>no</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </loader>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </os>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <cpu>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='host-passthrough' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='hostPassthroughMigratable'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>on</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>off</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='maximum' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='maximumMigratable'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>on</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>off</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='host-model' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <vendor>AMD</vendor>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='x2apic'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='hypervisor'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='stibp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='overflow-recov'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='succor'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='lbrv'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc-scale'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='flushbyasid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='pause-filter'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='pfthreshold'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='disable' name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='custom' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Dhyana-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Genoa'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='auto-ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='auto-ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-128'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-256'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-512'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v6'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v7'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='KnightsMill'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512er'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512pf'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='KnightsMill-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512er'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512pf'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G4-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tbm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G5-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tbm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SierraForest'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cmpccxadd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SierraForest-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cmpccxadd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='athlon'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='athlon-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='core2duo'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='core2duo-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='coreduo'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='coreduo-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='n270'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='n270-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='phenom'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='phenom-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </cpu>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <memoryBacking supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <enum name='sourceType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>file</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>anonymous</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>memfd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </memoryBacking>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <devices>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <disk supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='diskDevice'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>disk</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>cdrom</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>floppy</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>lun</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='bus'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ide</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>fdc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>scsi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>sata</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-non-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </disk>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <graphics supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vnc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>egl-headless</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dbus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </graphics>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <video supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='modelType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vga</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>cirrus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>none</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>bochs</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ramfb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </video>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <hostdev supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='mode'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>subsystem</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='startupPolicy'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>default</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>mandatory</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>requisite</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>optional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='subsysType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pci</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>scsi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='capsType'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='pciBackend'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </hostdev>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <rng supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-non-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>random</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>egd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>builtin</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </rng>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <filesystem supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='driverType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>path</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>handle</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtiofs</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </filesystem>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <tpm supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tpm-tis</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tpm-crb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>emulator</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>external</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendVersion'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>2.0</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </tpm>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <redirdev supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='bus'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </redirdev>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <channel supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pty</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>unix</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </channel>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <crypto supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>qemu</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>builtin</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </crypto>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <interface supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>default</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>passt</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </interface>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <panic supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>isa</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>hyperv</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </panic>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <console supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>null</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pty</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dev</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>file</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pipe</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>stdio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>udp</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tcp</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>unix</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>qemu-vdagent</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dbus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </console>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </devices>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <features>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <gic supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <vmcoreinfo supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <genid supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <backingStoreInput supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <backup supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <async-teardown supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <ps2 supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <sev supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <sgx supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <hyperv supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='features'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>relaxed</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vapic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>spinlocks</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vpindex</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>runtime</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>synic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>stimer</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>reset</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vendor_id</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>frequencies</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>reenlightenment</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tlbflush</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ipi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>avic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>emsr_bitmap</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>xmm_input</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <defaults>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <spinlocks>4095</spinlocks>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <stimer_direct>on</stimer_direct>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </defaults>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </hyperv>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <launchSecurity supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='sectype'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tdx</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </launchSecurity>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </features>
Dec 05 09:11:52 compute-1 nova_compute[188144]: </domainCapabilities>
Dec 05 09:11:52 compute-1 nova_compute[188144]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.080 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 05 09:11:52 compute-1 nova_compute[188144]: <domainCapabilities>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <domain>kvm</domain>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <arch>x86_64</arch>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <vcpu max='4096'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <iothreads supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <os supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <enum name='firmware'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>efi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <loader supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>rom</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pflash</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='readonly'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>yes</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>no</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='secure'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>yes</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>no</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </loader>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </os>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <cpu>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='host-passthrough' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='hostPassthroughMigratable'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>on</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>off</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='maximum' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='maximumMigratable'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>on</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>off</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='host-model' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <vendor>AMD</vendor>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='x2apic'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='hypervisor'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='stibp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='overflow-recov'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='succor'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='lbrv'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='tsc-scale'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='flushbyasid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='pause-filter'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='pfthreshold'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <feature policy='disable' name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <mode name='custom' supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Broadwell-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Cooperlake-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Denverton-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Dhyana-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Genoa'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='auto-ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='auto-ibrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Milan-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amd-psfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='stibp-always-on'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-Rome-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='EPYC-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='GraniteRapids-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-128'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-256'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx10-512'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='prefetchiti'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Haswell-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 sudo[188999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-invqdeheqtxxvmjzhmwfjjyehairicmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925911.913567-4464-244681724616558/AnsiballZ_systemd.py'
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v6'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 sudo[188999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Icelake-Server-v7'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='IvyBridge-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='KnightsMill'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512er'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512pf'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='KnightsMill-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512er'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512pf'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G4-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tbm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Opteron_G5-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fma4'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tbm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xop'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SapphireRapids-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='amx-tile'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-bf16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-fp16'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bitalg'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrc'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fzrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='la57'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='taa-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xfd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SierraForest'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cmpccxadd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='SierraForest-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ifma'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cmpccxadd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fbsdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='fsrs'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ibrs-all'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mcdt-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pbrsb-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='psdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='serialize'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vaes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Client-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='hle'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='rtm'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Skylake-Server-v5'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512bw'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512cd'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512dq'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512f'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='avx512vl'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='invpcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pcid'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='pku'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='mpx'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v2'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v3'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='core-capability'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='split-lock-detect'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='Snowridge-v4'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='cldemote'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='erms'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='gfni'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdir64b'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='movdiri'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='xsaves'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='athlon'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='athlon-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='core2duo'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='core2duo-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='coreduo'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='coreduo-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='n270'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='n270-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='ss'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='phenom'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <blockers model='phenom-v1'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnow'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <feature name='3dnowext'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </blockers>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </mode>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </cpu>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <memoryBacking supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <enum name='sourceType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>file</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>anonymous</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <value>memfd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </memoryBacking>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <devices>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <disk supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='diskDevice'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>disk</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>cdrom</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>floppy</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>lun</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='bus'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>fdc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>scsi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>sata</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-non-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </disk>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <graphics supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vnc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>egl-headless</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dbus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </graphics>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <video supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='modelType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vga</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>cirrus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>none</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>bochs</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ramfb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </video>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <hostdev supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='mode'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>subsystem</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='startupPolicy'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>default</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>mandatory</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>requisite</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>optional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='subsysType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pci</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>scsi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='capsType'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='pciBackend'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </hostdev>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <rng supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtio-non-transitional</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>random</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>egd</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>builtin</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </rng>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <filesystem supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='driverType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>path</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>handle</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>virtiofs</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </filesystem>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <tpm supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tpm-tis</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tpm-crb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>emulator</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>external</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendVersion'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>2.0</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </tpm>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <redirdev supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='bus'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>usb</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </redirdev>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <channel supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pty</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>unix</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </channel>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <crypto supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>qemu</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendModel'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>builtin</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </crypto>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <interface supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='backendType'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>default</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>passt</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </interface>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <panic supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='model'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>isa</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>hyperv</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </panic>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <console supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='type'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>null</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vc</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pty</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dev</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>file</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>pipe</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>stdio</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>udp</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tcp</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>unix</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>qemu-vdagent</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>dbus</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </console>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </devices>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <features>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <gic supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <vmcoreinfo supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <genid supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <backingStoreInput supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <backup supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <async-teardown supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <ps2 supported='yes'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <sev supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <sgx supported='no'/>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <hyperv supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='features'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>relaxed</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vapic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>spinlocks</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vpindex</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>runtime</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>synic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>stimer</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>reset</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>vendor_id</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>frequencies</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>reenlightenment</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tlbflush</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>ipi</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>avic</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>emsr_bitmap</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>xmm_input</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <defaults>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <spinlocks>4095</spinlocks>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <stimer_direct>on</stimer_direct>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </defaults>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </hyperv>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     <launchSecurity supported='yes'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       <enum name='sectype'>
Dec 05 09:11:52 compute-1 nova_compute[188144]:         <value>tdx</value>
Dec 05 09:11:52 compute-1 nova_compute[188144]:       </enum>
Dec 05 09:11:52 compute-1 nova_compute[188144]:     </launchSecurity>
Dec 05 09:11:52 compute-1 nova_compute[188144]:   </features>
Dec 05 09:11:52 compute-1 nova_compute[188144]: </domainCapabilities>
Dec 05 09:11:52 compute-1 nova_compute[188144]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.151 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.152 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.152 188148 DEBUG nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.152 188148 INFO nova.virt.libvirt.host [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Secure Boot support detected
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.155 188148 INFO nova.virt.libvirt.driver [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.155 188148 INFO nova.virt.libvirt.driver [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.170 188148 DEBUG nova.virt.libvirt.driver [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] cpu compare xml: <cpu match="exact">
Dec 05 09:11:52 compute-1 nova_compute[188144]:   <model>Nehalem</model>
Dec 05 09:11:52 compute-1 nova_compute[188144]: </cpu>
Dec 05 09:11:52 compute-1 nova_compute[188144]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.176 188148 DEBUG nova.virt.libvirt.driver [None req-d36cd56c-951f-4d9c-8180-5c001720b410 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 05 09:11:52 compute-1 python3.9[189001]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 09:11:52 compute-1 systemd[1]: Stopping nova_compute container...
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.649 188148 DEBUG oslo_concurrency.lockutils [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.650 188148 DEBUG oslo_concurrency.lockutils [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:11:52 compute-1 nova_compute[188144]: 2025-12-05 09:11:52.650 188148 DEBUG oslo_concurrency.lockutils [None req-3b7978d9-3873-4cdc-bc83-ed09e977e0f5 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:11:53 compute-1 virtqemud[188731]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 05 09:11:53 compute-1 virtqemud[188731]: hostname: compute-1
Dec 05 09:11:53 compute-1 virtqemud[188731]: End of file while reading data: Input/output error
Dec 05 09:11:53 compute-1 systemd[1]: libpod-0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8.scope: Deactivated successfully.
Dec 05 09:11:53 compute-1 systemd[1]: libpod-0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8.scope: Consumed 4.072s CPU time.
Dec 05 09:11:53 compute-1 podman[189005]: 2025-12-05 09:11:53.25529102 +0000 UTC m=+0.659839456 container died 0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Dec 05 09:11:53 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8-userdata-shm.mount: Deactivated successfully.
Dec 05 09:11:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29-merged.mount: Deactivated successfully.
Dec 05 09:11:54 compute-1 podman[189005]: 2025-12-05 09:11:54.018213889 +0000 UTC m=+1.422762325 container cleanup 0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3)
Dec 05 09:11:54 compute-1 podman[189005]: nova_compute
Dec 05 09:11:54 compute-1 podman[189036]: nova_compute
Dec 05 09:11:54 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 05 09:11:54 compute-1 systemd[1]: Stopped nova_compute container.
Dec 05 09:11:54 compute-1 systemd[1]: Starting nova_compute container...
Dec 05 09:11:54 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:11:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b1f9d91d5b5128a72e0eeedcad988fe210bcab760888c4e537eee3c6b8e0d29/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:54 compute-1 podman[189050]: 2025-12-05 09:11:54.212886835 +0000 UTC m=+0.090565568 container init 0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 05 09:11:54 compute-1 podman[189050]: 2025-12-05 09:11:54.223478325 +0000 UTC m=+0.101157028 container start 0083f575e3208a4b9eecd208ae4078ca03e0269dd6771393ca4b6d2e2f2278e8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:11:54 compute-1 podman[189050]: nova_compute
Dec 05 09:11:54 compute-1 nova_compute[189066]: + sudo -E kolla_set_configs
Dec 05 09:11:54 compute-1 systemd[1]: Started nova_compute container.
Dec 05 09:11:54 compute-1 sudo[188999]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Validating config file
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying service configuration files
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /etc/ceph
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Creating directory /etc/ceph
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /etc/ceph
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Writing out command to execute
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 09:11:54 compute-1 nova_compute[189066]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 09:11:54 compute-1 nova_compute[189066]: ++ cat /run_command
Dec 05 09:11:54 compute-1 nova_compute[189066]: + CMD=nova-compute
Dec 05 09:11:54 compute-1 nova_compute[189066]: + ARGS=
Dec 05 09:11:54 compute-1 nova_compute[189066]: + sudo kolla_copy_cacerts
Dec 05 09:11:54 compute-1 nova_compute[189066]: + [[ ! -n '' ]]
Dec 05 09:11:54 compute-1 nova_compute[189066]: + . kolla_extend_start
Dec 05 09:11:54 compute-1 nova_compute[189066]: + echo 'Running command: '\''nova-compute'\'''
Dec 05 09:11:54 compute-1 nova_compute[189066]: + umask 0022
Dec 05 09:11:54 compute-1 nova_compute[189066]: + exec nova-compute
Dec 05 09:11:54 compute-1 nova_compute[189066]: Running command: 'nova-compute'
Dec 05 09:11:54 compute-1 sudo[189227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxrvlmgaqkfbcxrnylkjwpsktqeoljpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925914.5927463-4491-28509119657538/AnsiballZ_podman_container.py'
Dec 05 09:11:54 compute-1 sudo[189227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:11:55 compute-1 python3.9[189229]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 05 09:11:55 compute-1 systemd[1]: Started libpod-conmon-59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab.scope.
Dec 05 09:11:55 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:11:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d7f73ded750e19702abb7b872a39e49b2e92387d18f43689ac4f59277d46f9d/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d7f73ded750e19702abb7b872a39e49b2e92387d18f43689ac4f59277d46f9d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d7f73ded750e19702abb7b872a39e49b2e92387d18f43689ac4f59277d46f9d/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 05 09:11:55 compute-1 podman[189253]: 2025-12-05 09:11:55.469264917 +0000 UTC m=+0.131600257 container init 59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec 05 09:11:55 compute-1 podman[189253]: 2025-12-05 09:11:55.479725674 +0000 UTC m=+0.142060994 container start 59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 09:11:55 compute-1 python3.9[189229]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Applying nova statedir ownership
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 05 09:11:55 compute-1 nova_compute_init[189276]: INFO:nova_statedir:Nova statedir ownership complete
Dec 05 09:11:55 compute-1 systemd[1]: libpod-59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab.scope: Deactivated successfully.
Dec 05 09:11:55 compute-1 podman[189289]: 2025-12-05 09:11:55.625604072 +0000 UTC m=+0.028129673 container died 59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:11:55 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab-userdata-shm.mount: Deactivated successfully.
Dec 05 09:11:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-7d7f73ded750e19702abb7b872a39e49b2e92387d18f43689ac4f59277d46f9d-merged.mount: Deactivated successfully.
Dec 05 09:11:55 compute-1 podman[189289]: 2025-12-05 09:11:55.665277257 +0000 UTC m=+0.067802838 container cleanup 59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true)
Dec 05 09:11:55 compute-1 systemd[1]: libpod-conmon-59dfeeed11c4e649ef970504cbc5a1f7167f0c7de0b0af93e55c353542dfe0ab.scope: Deactivated successfully.
Dec 05 09:11:55 compute-1 sudo[189227]: pam_unix(sudo:session): session closed for user root
Dec 05 09:11:56 compute-1 sshd-session[160360]: Connection closed by 192.168.122.30 port 56692
Dec 05 09:11:56 compute-1 sshd-session[160357]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:11:56 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Dec 05 09:11:56 compute-1 systemd[1]: session-25.scope: Consumed 2min 1.078s CPU time.
Dec 05 09:11:56 compute-1 systemd-logind[807]: Session 25 logged out. Waiting for processes to exit.
Dec 05 09:11:56 compute-1 systemd-logind[807]: Removed session 25.
Dec 05 09:11:56 compute-1 nova_compute[189066]: 2025-12-05 09:11:56.790 189070 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 09:11:56 compute-1 nova_compute[189066]: 2025-12-05 09:11:56.791 189070 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 09:11:56 compute-1 nova_compute[189066]: 2025-12-05 09:11:56.791 189070 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 05 09:11:56 compute-1 nova_compute[189066]: 2025-12-05 09:11:56.791 189070 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 05 09:11:56 compute-1 nova_compute[189066]: 2025-12-05 09:11:56.980 189070 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.007 189070 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.008 189070 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.570 189070 INFO nova.virt.driver [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.694 189070 INFO nova.compute.provider_config [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.756 189070 DEBUG oslo_concurrency.lockutils [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.756 189070 DEBUG oslo_concurrency.lockutils [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.757 189070 DEBUG oslo_concurrency.lockutils [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.758 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.758 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.759 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.759 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.760 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.760 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.760 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.760 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.761 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.761 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.761 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.761 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.761 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.762 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.762 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.762 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.762 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.762 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.762 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.762 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.763 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.763 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.763 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.763 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.763 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.764 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.764 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.764 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.764 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.764 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.765 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.765 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.765 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.765 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.765 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.766 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.766 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.766 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.766 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.766 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.767 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.767 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.767 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.767 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.767 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.767 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.768 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.768 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.768 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.768 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.768 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.768 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.769 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.769 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.769 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.769 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.769 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.769 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.770 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.770 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.770 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.770 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.770 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.770 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.770 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.771 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.771 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.771 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.771 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.771 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.771 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.771 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.772 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.772 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.772 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.772 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.772 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.772 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.772 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.773 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.773 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.773 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.773 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.773 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.774 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.774 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.774 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.774 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.774 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.774 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.775 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.775 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.775 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.775 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.775 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.775 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.776 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.776 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.776 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.776 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.776 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.776 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.777 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.777 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.777 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.777 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.777 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.778 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.778 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.778 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.778 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.778 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.778 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.779 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.779 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.779 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.779 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.779 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.779 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.780 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.780 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.780 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.780 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.780 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.780 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.780 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.781 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.781 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.781 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.781 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.781 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.782 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.782 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.782 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.782 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.782 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.782 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.783 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.783 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.783 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.783 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.783 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.783 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.783 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.784 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.784 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.784 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.784 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.784 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.784 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.785 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.785 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.785 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.785 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.785 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.785 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.786 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.786 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.786 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.786 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.786 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.787 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.787 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.787 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.787 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.787 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.787 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.787 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.788 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.788 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.788 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.788 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.788 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.788 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.788 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.789 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.789 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.789 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.789 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.789 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.790 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.790 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.790 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.790 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.790 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.790 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.791 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.791 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.791 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.791 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.791 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.791 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.792 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.792 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.792 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.792 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.792 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.792 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.793 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.793 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.793 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.793 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.793 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.793 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.794 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.794 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.794 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.794 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.794 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.794 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.795 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.795 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.795 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.795 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.795 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.795 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.795 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.796 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.796 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.796 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.796 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.796 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.797 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.797 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.797 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.797 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.797 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.797 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.797 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.798 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.798 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.798 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.798 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.798 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.799 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.799 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.799 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.799 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.799 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.799 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.800 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.800 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.800 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.800 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.800 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.801 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.801 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.801 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.801 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.801 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.801 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.802 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.802 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.802 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.802 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.802 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.802 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.803 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.803 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.803 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.803 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.803 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.803 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.804 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.804 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.804 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.804 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.804 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.804 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.804 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.805 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.805 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.805 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.805 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.805 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.805 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.806 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.806 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.806 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.806 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.806 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.807 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.807 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.807 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.807 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.807 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.807 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.808 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.808 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.808 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.808 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.808 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.808 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.809 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.809 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.809 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.809 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.809 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.809 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.810 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.810 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.810 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.810 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.810 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.810 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.811 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.811 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.811 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.811 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.811 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.812 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.812 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.812 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.812 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.812 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.812 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.813 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.813 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.813 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.813 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.813 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.814 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.814 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.814 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.814 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.814 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.814 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.815 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.815 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.815 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.815 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.815 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.815 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.816 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.816 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.816 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.816 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.816 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.816 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.817 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.817 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.817 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.817 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.817 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.817 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.818 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.818 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.818 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.818 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.818 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.818 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.818 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.819 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.819 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.819 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.819 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.820 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.820 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.820 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.820 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.820 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.820 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.821 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.821 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.821 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.821 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.821 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.821 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.821 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.822 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.822 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.822 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.822 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.822 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.823 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.823 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.823 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.823 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.823 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.823 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.823 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.824 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.824 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.824 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.824 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.824 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.825 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.825 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.825 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.825 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.825 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.825 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.826 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.826 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.826 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.826 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.826 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.826 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.826 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.827 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.827 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.827 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.827 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.827 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.827 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.828 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.828 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.828 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.828 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.828 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.828 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.829 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.829 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.829 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.829 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.829 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.830 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.830 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.830 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.830 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.830 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.830 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.831 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.831 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.831 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.831 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.831 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.831 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.832 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.832 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.832 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.832 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.832 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.833 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.833 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.833 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.833 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.833 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.834 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.834 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.834 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.834 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.834 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.834 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.835 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.835 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.835 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.835 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.835 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.835 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.835 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.836 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.836 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.836 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.836 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.836 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.836 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.837 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.837 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.837 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.837 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.837 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.837 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.838 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.838 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.838 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.838 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.838 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.839 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.839 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.839 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.839 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.839 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.839 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.839 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.840 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.840 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.840 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.840 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.840 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.841 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.841 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.841 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.841 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.841 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.841 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.841 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.842 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.842 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.842 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.842 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.842 189070 WARNING oslo_config.cfg [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 05 09:11:57 compute-1 nova_compute[189066]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 05 09:11:57 compute-1 nova_compute[189066]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 05 09:11:57 compute-1 nova_compute[189066]: and ``live_migration_inbound_addr`` respectively.
Dec 05 09:11:57 compute-1 nova_compute[189066]: ).  Its value may be silently ignored in the future.
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.843 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.843 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.843 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.843 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.843 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.844 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.844 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.844 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.845 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.845 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.845 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.845 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.845 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.845 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.846 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.846 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.846 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.846 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.846 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.846 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.846 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.847 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.847 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.847 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.847 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.847 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.847 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.847 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.848 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.848 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.848 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.848 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.848 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.849 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.849 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.849 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.849 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.849 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.849 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.850 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.850 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.850 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.850 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.850 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.850 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.851 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.851 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.851 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.851 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.851 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.851 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.852 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.852 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.852 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.852 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.852 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.853 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.853 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.853 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.853 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.853 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.854 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.854 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.854 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.854 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.854 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.854 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.855 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.855 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.855 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.855 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.855 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.855 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.856 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.856 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.856 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.856 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.856 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.856 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.857 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.857 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.857 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.857 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.858 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.858 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.858 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.858 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.858 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.859 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.859 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.859 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.859 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.860 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.860 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.860 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.860 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.860 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.860 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.861 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.861 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.861 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.861 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.861 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.861 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.862 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.862 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.862 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.862 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.862 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.862 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.863 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.863 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.863 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.863 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.863 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.863 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.864 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.864 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.864 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.864 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.864 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.865 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.865 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.865 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.865 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.865 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.865 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.866 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.866 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.866 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.866 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.866 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.867 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.867 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.867 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.867 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.867 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.867 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.868 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.868 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.868 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.868 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.869 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.869 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.869 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.869 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.869 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.870 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.870 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.870 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.870 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.870 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.870 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.871 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.871 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.871 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.871 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.871 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.872 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.872 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.872 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.872 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.872 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.873 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.873 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.873 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.873 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.873 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.873 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.874 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.874 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.874 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.874 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.874 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.874 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.875 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.875 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.875 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.875 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.876 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.876 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.876 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.876 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.876 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.876 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.877 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.877 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.877 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.877 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.877 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.877 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.878 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.878 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.878 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.878 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.878 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.879 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.879 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.879 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.879 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.879 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.880 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.880 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.880 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.880 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.880 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.881 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.881 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.881 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.881 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.881 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.881 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.881 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.882 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.882 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.882 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.882 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.883 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.883 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.883 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.883 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.883 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.883 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.884 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.884 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.884 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.884 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.884 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.885 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.885 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.885 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.885 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.886 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.886 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.886 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.886 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.886 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.887 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.887 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.887 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.887 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.887 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.888 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.888 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.888 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.888 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.889 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.889 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.889 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.889 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.890 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.890 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.890 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.890 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.891 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.891 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.891 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.891 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.891 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.892 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.892 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.892 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.892 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.892 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.893 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.893 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.893 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.893 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.893 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.894 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.894 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.894 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.894 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.894 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.895 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.895 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.895 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.895 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.895 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.896 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.896 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.896 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.896 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.896 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.897 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.897 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.897 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.897 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.898 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.898 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.898 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.898 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.898 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.899 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.899 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.899 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.899 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.899 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.899 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.900 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.900 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.900 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.900 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.900 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.900 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.900 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.901 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.901 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.901 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.901 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.901 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.902 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.902 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.902 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.902 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.902 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.903 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.903 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.903 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.903 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.903 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.903 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.903 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.904 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.904 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.904 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.904 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.904 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.904 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.905 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.905 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.905 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.905 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.905 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.906 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.906 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.906 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.906 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.906 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.907 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.907 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.907 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.907 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.907 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.908 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.908 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.908 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.908 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.908 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.908 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.909 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.909 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.909 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.909 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.909 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.909 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.910 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.910 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.910 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.910 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.910 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.910 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.910 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.911 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.911 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.911 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.911 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.911 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.911 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.912 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.912 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.912 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.912 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.912 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.913 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.913 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.913 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.913 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.913 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.913 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.914 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.914 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.914 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.914 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.914 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.914 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.915 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.915 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.915 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.915 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.915 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.915 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.915 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.916 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.916 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.916 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.916 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.916 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.916 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.917 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.917 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.917 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.917 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.917 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.917 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.918 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.918 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.918 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.918 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.918 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.918 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.918 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.919 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.919 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.919 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.919 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.919 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.919 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.919 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.920 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.920 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.920 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.920 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.920 189070 DEBUG oslo_service.service [None req-2382022c-0511-4b94-a812-4bae7ddc358a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.922 189070 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.938 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.940 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.940 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.940 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.959 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fcbb17e43d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.962 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fcbb17e43d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.963 189070 INFO nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Connection event '1' reason 'None'
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.971 189070 INFO nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Libvirt host capabilities <capabilities>
Dec 05 09:11:57 compute-1 nova_compute[189066]: 
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <host>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <uuid>dc540e4b-dbcd-41cc-9101-8457f90414ef</uuid>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <cpu>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <arch>x86_64</arch>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model>EPYC-Rome-v4</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <vendor>AMD</vendor>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <microcode version='16777317'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <signature family='23' model='49' stepping='0'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='x2apic'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='tsc-deadline'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='osxsave'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='hypervisor'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='tsc_adjust'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='spec-ctrl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='stibp'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='arch-capabilities'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='ssbd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='cmp_legacy'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='topoext'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='virt-ssbd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='lbrv'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='tsc-scale'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='vmcb-clean'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='pause-filter'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='pfthreshold'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='svme-addr-chk'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='rdctl-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='skip-l1dfl-vmentry'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='mds-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature name='pschange-mc-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <pages unit='KiB' size='4'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <pages unit='KiB' size='2048'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <pages unit='KiB' size='1048576'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </cpu>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <power_management>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <suspend_mem/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <suspend_disk/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <suspend_hybrid/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </power_management>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <iommu support='no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <migration_features>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <live/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <uri_transports>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <uri_transport>tcp</uri_transport>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <uri_transport>rdma</uri_transport>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </uri_transports>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </migration_features>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <topology>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <cells num='1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <cell id='0'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:           <memory unit='KiB'>7864304</memory>
Dec 05 09:11:57 compute-1 nova_compute[189066]:           <pages unit='KiB' size='4'>1966076</pages>
Dec 05 09:11:57 compute-1 nova_compute[189066]:           <pages unit='KiB' size='2048'>0</pages>
Dec 05 09:11:57 compute-1 nova_compute[189066]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 05 09:11:57 compute-1 nova_compute[189066]:           <distances>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <sibling id='0' value='10'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:           </distances>
Dec 05 09:11:57 compute-1 nova_compute[189066]:           <cpus num='8'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:           </cpus>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         </cell>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </cells>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </topology>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <cache>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </cache>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <secmodel>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model>selinux</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <doi>0</doi>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </secmodel>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <secmodel>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model>dac</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <doi>0</doi>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </secmodel>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   </host>
Dec 05 09:11:57 compute-1 nova_compute[189066]: 
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <guest>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <os_type>hvm</os_type>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <arch name='i686'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <wordsize>32</wordsize>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <domain type='qemu'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <domain type='kvm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </arch>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <features>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <pae/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <nonpae/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <acpi default='on' toggle='yes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <apic default='on' toggle='no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <cpuselection/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <deviceboot/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <disksnapshot default='on' toggle='no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <externalSnapshot/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </features>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   </guest>
Dec 05 09:11:57 compute-1 nova_compute[189066]: 
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <guest>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <os_type>hvm</os_type>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <arch name='x86_64'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <wordsize>64</wordsize>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <domain type='qemu'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <domain type='kvm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </arch>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <features>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <acpi default='on' toggle='yes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <apic default='on' toggle='no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <cpuselection/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <deviceboot/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <disksnapshot default='on' toggle='no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <externalSnapshot/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </features>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   </guest>
Dec 05 09:11:57 compute-1 nova_compute[189066]: 
Dec 05 09:11:57 compute-1 nova_compute[189066]: </capabilities>
Dec 05 09:11:57 compute-1 nova_compute[189066]: 
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.976 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 05 09:11:57 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.980 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 05 09:11:57 compute-1 nova_compute[189066]: <domainCapabilities>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <domain>kvm</domain>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <arch>i686</arch>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <vcpu max='240'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <iothreads supported='yes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <os supported='yes'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <enum name='firmware'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <loader supported='yes'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>rom</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>pflash</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <enum name='readonly'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>yes</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>no</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <enum name='secure'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>no</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </loader>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   </os>
Dec 05 09:11:57 compute-1 nova_compute[189066]:   <cpu>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <mode name='host-passthrough' supported='yes'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <enum name='hostPassthroughMigratable'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>on</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>off</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <mode name='maximum' supported='yes'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <enum name='maximumMigratable'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>on</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <value>off</value>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <mode name='host-model' supported='yes'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <vendor>AMD</vendor>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='x2apic'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='hypervisor'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='stibp'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='ssbd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='overflow-recov'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='succor'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='ibrs'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='lbrv'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc-scale'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='flushbyasid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='pause-filter'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='pfthreshold'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <feature policy='disable' name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:57 compute-1 nova_compute[189066]:     <mode name='custom' supported='yes'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Broadwell'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Broadwell-IBRS'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Broadwell-noTSX'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v3'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v4'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cooperlake'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cooperlake-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Cooperlake-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Denverton'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Denverton-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Denverton-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Denverton-v3'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Dhyana-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Genoa'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='auto-ibrs'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='auto-ibrs'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v3'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-v3'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='EPYC-v4'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx10'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx10-128'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx10-256'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx10-512'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Haswell'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Haswell-IBRS'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Haswell-noTSX'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Haswell-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Haswell-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Haswell-v3'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Haswell-v4'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v1'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v2'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v3'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v4'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v5'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v6'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 09:11:57 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v7'>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:57 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='KnightsMill'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512er'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512pf'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='KnightsMill-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512er'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512pf'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G4-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tbm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G5-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tbm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SierraForest'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cmpccxadd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SierraForest-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cmpccxadd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='athlon'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='athlon-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='core2duo'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='core2duo-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='coreduo'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='coreduo-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='n270'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='n270-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='phenom'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='phenom-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <memoryBacking supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <enum name='sourceType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>file</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>anonymous</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>memfd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </memoryBacking>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <disk supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='diskDevice'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>disk</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>cdrom</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>floppy</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>lun</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='bus'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ide</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>fdc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>scsi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>sata</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-non-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <graphics supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vnc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>egl-headless</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dbus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <video supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='modelType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vga</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>cirrus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>none</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>bochs</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ramfb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </video>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <hostdev supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='mode'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>subsystem</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='startupPolicy'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>default</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>mandatory</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>requisite</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>optional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='subsysType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pci</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>scsi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='capsType'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='pciBackend'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </hostdev>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <rng supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-non-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>random</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>egd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>builtin</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <filesystem supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='driverType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>path</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>handle</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtiofs</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </filesystem>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <tpm supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tpm-tis</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tpm-crb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>emulator</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>external</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendVersion'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>2.0</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </tpm>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <redirdev supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='bus'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </redirdev>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <channel supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pty</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>unix</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </channel>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <crypto supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>qemu</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>builtin</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </crypto>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <interface supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>default</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>passt</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <panic supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>isa</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>hyperv</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </panic>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <console supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>null</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pty</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dev</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>file</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pipe</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>stdio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>udp</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tcp</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>unix</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>qemu-vdagent</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dbus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </console>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <features>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <gic supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <vmcoreinfo supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <genid supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <backingStoreInput supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <backup supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <async-teardown supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <ps2 supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <sev supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <sgx supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <hyperv supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='features'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>relaxed</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vapic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>spinlocks</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vpindex</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>runtime</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>synic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>stimer</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>reset</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vendor_id</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>frequencies</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>reenlightenment</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tlbflush</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ipi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>avic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>emsr_bitmap</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>xmm_input</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <defaults>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <spinlocks>4095</spinlocks>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <stimer_direct>on</stimer_direct>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </defaults>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </hyperv>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <launchSecurity supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='sectype'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tdx</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </launchSecurity>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </features>
Dec 05 09:11:58 compute-1 nova_compute[189066]: </domainCapabilities>
Dec 05 09:11:58 compute-1 nova_compute[189066]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.986 189070 WARNING nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.986 189070 DEBUG nova.virt.libvirt.volume.mount [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:57.988 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 05 09:11:58 compute-1 nova_compute[189066]: <domainCapabilities>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <domain>kvm</domain>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <arch>i686</arch>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <vcpu max='4096'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <iothreads supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <os supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <enum name='firmware'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <loader supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>rom</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pflash</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='readonly'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>yes</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>no</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='secure'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>no</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </loader>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </os>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <cpu>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='host-passthrough' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='hostPassthroughMigratable'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>on</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>off</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='maximum' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='maximumMigratable'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>on</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>off</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='host-model' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <vendor>AMD</vendor>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='x2apic'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='hypervisor'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='stibp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='overflow-recov'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='succor'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='lbrv'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc-scale'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='flushbyasid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='pause-filter'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='pfthreshold'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='disable' name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='custom' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Dhyana-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Genoa'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='auto-ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='auto-ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-128'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-256'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-512'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v6'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v7'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='KnightsMill'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512er'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512pf'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='KnightsMill-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512er'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512pf'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G4-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tbm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G5-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tbm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SierraForest'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cmpccxadd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SierraForest-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cmpccxadd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='athlon'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='athlon-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='core2duo'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='core2duo-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='coreduo'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='coreduo-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='n270'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='n270-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='phenom'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='phenom-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <memoryBacking supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <enum name='sourceType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>file</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>anonymous</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>memfd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </memoryBacking>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <disk supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='diskDevice'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>disk</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>cdrom</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>floppy</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>lun</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='bus'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>fdc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>scsi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>sata</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-non-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <graphics supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vnc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>egl-headless</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dbus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <video supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='modelType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vga</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>cirrus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>none</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>bochs</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ramfb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </video>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <hostdev supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='mode'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>subsystem</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='startupPolicy'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>default</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>mandatory</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>requisite</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>optional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='subsysType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pci</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>scsi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='capsType'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='pciBackend'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </hostdev>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <rng supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-non-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>random</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>egd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>builtin</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <filesystem supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='driverType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>path</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>handle</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtiofs</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </filesystem>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <tpm supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tpm-tis</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tpm-crb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>emulator</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>external</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendVersion'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>2.0</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </tpm>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <redirdev supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='bus'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </redirdev>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <channel supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pty</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>unix</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </channel>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <crypto supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>qemu</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>builtin</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </crypto>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <interface supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>default</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>passt</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <panic supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>isa</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>hyperv</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </panic>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <console supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>null</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pty</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dev</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>file</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pipe</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>stdio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>udp</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tcp</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>unix</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>qemu-vdagent</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dbus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </console>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <features>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <gic supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <vmcoreinfo supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <genid supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <backingStoreInput supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <backup supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <async-teardown supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <ps2 supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <sev supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <sgx supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <hyperv supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='features'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>relaxed</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vapic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>spinlocks</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vpindex</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>runtime</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>synic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>stimer</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>reset</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vendor_id</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>frequencies</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>reenlightenment</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tlbflush</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ipi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>avic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>emsr_bitmap</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>xmm_input</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <defaults>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <spinlocks>4095</spinlocks>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <stimer_direct>on</stimer_direct>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </defaults>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </hyperv>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <launchSecurity supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='sectype'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tdx</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </launchSecurity>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </features>
Dec 05 09:11:58 compute-1 nova_compute[189066]: </domainCapabilities>
Dec 05 09:11:58 compute-1 nova_compute[189066]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.027 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.032 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 05 09:11:58 compute-1 nova_compute[189066]: <domainCapabilities>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <domain>kvm</domain>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <arch>x86_64</arch>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <vcpu max='240'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <iothreads supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <os supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <enum name='firmware'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <loader supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>rom</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pflash</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='readonly'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>yes</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>no</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='secure'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>no</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </loader>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </os>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <cpu>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='host-passthrough' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='hostPassthroughMigratable'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>on</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>off</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='maximum' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='maximumMigratable'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>on</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>off</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='host-model' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <vendor>AMD</vendor>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='x2apic'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='hypervisor'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='stibp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='overflow-recov'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='succor'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='lbrv'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc-scale'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='flushbyasid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='pause-filter'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='pfthreshold'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='disable' name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='custom' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Dhyana-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Genoa'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='auto-ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='auto-ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-128'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-256'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-512'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v6'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v7'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='KnightsMill'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512er'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512pf'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='KnightsMill-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512er'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512pf'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G4-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tbm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G5-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tbm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SierraForest'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cmpccxadd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SierraForest-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cmpccxadd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='athlon'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='athlon-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='core2duo'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='core2duo-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='coreduo'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='coreduo-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='n270'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='n270-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='phenom'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='phenom-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <memoryBacking supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <enum name='sourceType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>file</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>anonymous</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>memfd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </memoryBacking>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <disk supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='diskDevice'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>disk</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>cdrom</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>floppy</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>lun</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='bus'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ide</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>fdc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>scsi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>sata</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-non-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <graphics supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vnc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>egl-headless</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dbus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <video supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='modelType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vga</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>cirrus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>none</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>bochs</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ramfb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </video>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <hostdev supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='mode'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>subsystem</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='startupPolicy'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>default</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>mandatory</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>requisite</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>optional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='subsysType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pci</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>scsi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='capsType'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='pciBackend'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </hostdev>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <rng supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-non-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>random</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>egd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>builtin</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <filesystem supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='driverType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>path</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>handle</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtiofs</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </filesystem>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <tpm supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tpm-tis</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tpm-crb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>emulator</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>external</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendVersion'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>2.0</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </tpm>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <redirdev supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='bus'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </redirdev>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <channel supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pty</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>unix</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </channel>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <crypto supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>qemu</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>builtin</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </crypto>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <interface supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>default</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>passt</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <panic supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>isa</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>hyperv</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </panic>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <console supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>null</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pty</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dev</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>file</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pipe</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>stdio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>udp</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tcp</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>unix</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>qemu-vdagent</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dbus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </console>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <features>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <gic supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <vmcoreinfo supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <genid supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <backingStoreInput supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <backup supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <async-teardown supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <ps2 supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <sev supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <sgx supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <hyperv supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='features'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>relaxed</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vapic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>spinlocks</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vpindex</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>runtime</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>synic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>stimer</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>reset</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vendor_id</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>frequencies</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>reenlightenment</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tlbflush</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ipi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>avic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>emsr_bitmap</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>xmm_input</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <defaults>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <spinlocks>4095</spinlocks>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <stimer_direct>on</stimer_direct>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </defaults>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </hyperv>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <launchSecurity supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='sectype'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tdx</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </launchSecurity>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </features>
Dec 05 09:11:58 compute-1 nova_compute[189066]: </domainCapabilities>
Dec 05 09:11:58 compute-1 nova_compute[189066]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.098 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 05 09:11:58 compute-1 nova_compute[189066]: <domainCapabilities>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <domain>kvm</domain>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <arch>x86_64</arch>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <vcpu max='4096'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <iothreads supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <os supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <enum name='firmware'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>efi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <loader supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>rom</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pflash</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='readonly'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>yes</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>no</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='secure'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>yes</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>no</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </loader>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </os>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <cpu>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='host-passthrough' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='hostPassthroughMigratable'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>on</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>off</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='maximum' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='maximumMigratable'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>on</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>off</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='host-model' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <vendor>AMD</vendor>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='x2apic'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='hypervisor'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='stibp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='overflow-recov'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='succor'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='amd-ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='lbrv'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='tsc-scale'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='flushbyasid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='pause-filter'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='pfthreshold'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='svme-addr-chk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <feature policy='disable' name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <mode name='custom' supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Broadwell-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Cooperlake-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Denverton-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Dhyana-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Genoa'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='auto-ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='auto-ibrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Milan-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amd-psfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='no-nested-data-bp'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='null-sel-clr-base'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='stibp-always-on'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-Rome-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='EPYC-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='GraniteRapids-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-128'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-256'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx10-512'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='prefetchiti'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Haswell-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v6'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Icelake-Server-v7'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='IvyBridge-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='KnightsMill'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512er'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512pf'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='KnightsMill-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4fmaps'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-4vnniw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512er'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512pf'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G4-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tbm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Opteron_G5-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fma4'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tbm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xop'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SapphireRapids-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='amx-tile'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-bf16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-fp16'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512-vpopcntdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bitalg'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vbmi2'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrc'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fzrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='la57'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='taa-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='tsx-ldtrk'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xfd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SierraForest'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cmpccxadd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='SierraForest-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ifma'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-ne-convert'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx-vnni-int8'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='bus-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cmpccxadd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fbsdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='fsrs'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ibrs-all'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mcdt-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pbrsb-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='psdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='sbdr-ssdp-no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='serialize'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vaes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='vpclmulqdq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Client-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='hle'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='rtm'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Skylake-Server-v5'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512bw'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512cd'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512dq'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512f'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='avx512vl'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='invpcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pcid'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='pku'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='mpx'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v2'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v3'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='core-capability'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='split-lock-detect'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='Snowridge-v4'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='cldemote'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='erms'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='gfni'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdir64b'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='movdiri'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='xsaves'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='athlon'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='athlon-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='core2duo'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='core2duo-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='coreduo'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='coreduo-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='n270'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='n270-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='ss'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='phenom'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <blockers model='phenom-v1'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnow'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <feature name='3dnowext'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </blockers>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </mode>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <memoryBacking supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <enum name='sourceType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>file</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>anonymous</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <value>memfd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </memoryBacking>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <disk supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='diskDevice'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>disk</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>cdrom</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>floppy</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>lun</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='bus'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>fdc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>scsi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>sata</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-non-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <graphics supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vnc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>egl-headless</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dbus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <video supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='modelType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vga</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>cirrus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>none</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>bochs</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ramfb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </video>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <hostdev supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='mode'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>subsystem</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='startupPolicy'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>default</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>mandatory</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>requisite</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>optional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='subsysType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pci</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>scsi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='capsType'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='pciBackend'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </hostdev>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <rng supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtio-non-transitional</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>random</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>egd</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>builtin</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <filesystem supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='driverType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>path</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>handle</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>virtiofs</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </filesystem>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <tpm supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tpm-tis</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tpm-crb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>emulator</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>external</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendVersion'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>2.0</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </tpm>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <redirdev supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='bus'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>usb</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </redirdev>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <channel supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pty</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>unix</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </channel>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <crypto supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>qemu</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendModel'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>builtin</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </crypto>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <interface supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='backendType'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>default</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>passt</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <panic supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='model'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>isa</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>hyperv</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </panic>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <console supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='type'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>null</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vc</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pty</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dev</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>file</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>pipe</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>stdio</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>udp</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tcp</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>unix</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>qemu-vdagent</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>dbus</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </console>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <features>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <gic supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <vmcoreinfo supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <genid supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <backingStoreInput supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <backup supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <async-teardown supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <ps2 supported='yes'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <sev supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <sgx supported='no'/>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <hyperv supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='features'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>relaxed</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vapic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>spinlocks</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vpindex</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>runtime</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>synic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>stimer</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>reset</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>vendor_id</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>frequencies</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>reenlightenment</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tlbflush</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>ipi</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>avic</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>emsr_bitmap</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>xmm_input</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <defaults>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <spinlocks>4095</spinlocks>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <stimer_direct>on</stimer_direct>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </defaults>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </hyperv>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     <launchSecurity supported='yes'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       <enum name='sectype'>
Dec 05 09:11:58 compute-1 nova_compute[189066]:         <value>tdx</value>
Dec 05 09:11:58 compute-1 nova_compute[189066]:       </enum>
Dec 05 09:11:58 compute-1 nova_compute[189066]:     </launchSecurity>
Dec 05 09:11:58 compute-1 nova_compute[189066]:   </features>
Dec 05 09:11:58 compute-1 nova_compute[189066]: </domainCapabilities>
Dec 05 09:11:58 compute-1 nova_compute[189066]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.179 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.179 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.179 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.179 189070 INFO nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Secure Boot support detected
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.182 189070 INFO nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.182 189070 INFO nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.194 189070 DEBUG nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] cpu compare xml: <cpu match="exact">
Dec 05 09:11:58 compute-1 nova_compute[189066]:   <model>Nehalem</model>
Dec 05 09:11:58 compute-1 nova_compute[189066]: </cpu>
Dec 05 09:11:58 compute-1 nova_compute[189066]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.197 189070 DEBUG nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.469 189070 INFO nova.virt.node [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Determined node identity be68f9f1-7820-4bfa-8dbd-210e13729f64 from /var/lib/nova/compute_id
Dec 05 09:11:58 compute-1 nova_compute[189066]: 2025-12-05 09:11:58.828 189070 WARNING nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Compute nodes ['be68f9f1-7820-4bfa-8dbd-210e13729f64'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.325 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.501 189070 WARNING nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.502 189070 DEBUG oslo_concurrency.lockutils [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.502 189070 DEBUG oslo_concurrency.lockutils [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.503 189070 DEBUG oslo_concurrency.lockutils [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.503 189070 DEBUG nova.compute.resource_tracker [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:11:59 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Dec 05 09:11:59 compute-1 systemd[1]: Started libvirt nodedev daemon.
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.830 189070 WARNING nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.832 189070 DEBUG nova.compute.resource_tracker [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6197MB free_disk=73.53977584838867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.832 189070 DEBUG oslo_concurrency.lockutils [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.832 189070 DEBUG oslo_concurrency.lockutils [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:11:59 compute-1 nova_compute[189066]: 2025-12-05 09:11:59.856 189070 WARNING nova.compute.resource_tracker [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] No compute node record for compute-1.ctlplane.example.com:be68f9f1-7820-4bfa-8dbd-210e13729f64: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host be68f9f1-7820-4bfa-8dbd-210e13729f64 could not be found.
Dec 05 09:12:00 compute-1 nova_compute[189066]: 2025-12-05 09:12:00.168 189070 INFO nova.compute.resource_tracker [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: be68f9f1-7820-4bfa-8dbd-210e13729f64
Dec 05 09:12:00 compute-1 nova_compute[189066]: 2025-12-05 09:12:00.769 189070 DEBUG nova.compute.resource_tracker [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:12:00 compute-1 nova_compute[189066]: 2025-12-05 09:12:00.769 189070 DEBUG nova.compute.resource_tracker [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.154 189070 INFO nova.scheduler.client.report [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [req-7f1126b9-9ff8-45f5-a6ce-10fd4bef6b5b] Created resource provider record via placement API for resource provider with UUID be68f9f1-7820-4bfa-8dbd-210e13729f64 and name compute-1.ctlplane.example.com.
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.234 189070 DEBUG nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 05 09:12:01 compute-1 nova_compute[189066]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.235 189070 INFO nova.virt.libvirt.host [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] kernel doesn't support AMD SEV
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.235 189070 DEBUG nova.compute.provider_tree [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.236 189070 DEBUG nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.237 189070 DEBUG nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Libvirt baseline CPU <cpu>
Dec 05 09:12:01 compute-1 nova_compute[189066]:   <arch>x86_64</arch>
Dec 05 09:12:01 compute-1 nova_compute[189066]:   <model>Nehalem</model>
Dec 05 09:12:01 compute-1 nova_compute[189066]:   <vendor>AMD</vendor>
Dec 05 09:12:01 compute-1 nova_compute[189066]:   <topology sockets="8" cores="1" threads="1"/>
Dec 05 09:12:01 compute-1 nova_compute[189066]: </cpu>
Dec 05 09:12:01 compute-1 nova_compute[189066]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Dec 05 09:12:01 compute-1 anacron[77209]: Job `cron.daily' started
Dec 05 09:12:01 compute-1 anacron[77209]: Job `cron.daily' terminated
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.900 189070 DEBUG nova.scheduler.client.report [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Updated inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.901 189070 DEBUG nova.compute.provider_tree [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Updating resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 05 09:12:01 compute-1 nova_compute[189066]: 2025-12-05 09:12:01.901 189070 DEBUG nova.compute.provider_tree [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:12:02 compute-1 nova_compute[189066]: 2025-12-05 09:12:02.195 189070 DEBUG nova.compute.provider_tree [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Updating resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 05 09:12:02 compute-1 nova_compute[189066]: 2025-12-05 09:12:02.938 189070 DEBUG nova.compute.resource_tracker [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:12:02 compute-1 nova_compute[189066]: 2025-12-05 09:12:02.939 189070 DEBUG oslo_concurrency.lockutils [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:12:02 compute-1 nova_compute[189066]: 2025-12-05 09:12:02.940 189070 DEBUG nova.service [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 05 09:12:03 compute-1 nova_compute[189066]: 2025-12-05 09:12:03.389 189070 DEBUG nova.service [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 05 09:12:03 compute-1 nova_compute[189066]: 2025-12-05 09:12:03.390 189070 DEBUG nova.servicegroup.drivers.db [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 05 09:12:03 compute-1 sshd-session[189389]: Accepted publickey for zuul from 192.168.122.30 port 44802 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:12:03 compute-1 systemd-logind[807]: New session 27 of user zuul.
Dec 05 09:12:03 compute-1 systemd[1]: Started Session 27 of User zuul.
Dec 05 09:12:03 compute-1 sshd-session[189389]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:12:04 compute-1 python3.9[189542]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 09:12:06 compute-1 sudo[189696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ampoadxdyjndsoshvsxoufvfabdvvluy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925925.690876-69-93423302241773/AnsiballZ_systemd_service.py'
Dec 05 09:12:06 compute-1 sudo[189696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:06 compute-1 python3.9[189698]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:12:06 compute-1 systemd[1]: Reloading.
Dec 05 09:12:06 compute-1 systemd-rc-local-generator[189726]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:12:06 compute-1 systemd-sysv-generator[189729]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:12:07 compute-1 sudo[189696]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:07 compute-1 python3.9[189883]: ansible-ansible.builtin.service_facts Invoked
Dec 05 09:12:07 compute-1 network[189900]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 09:12:07 compute-1 network[189901]: 'network-scripts' will be removed from distribution in near future.
Dec 05 09:12:07 compute-1 network[189902]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 09:12:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:12:08.853 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:12:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:12:08.857 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:12:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:12:08.857 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:12:11 compute-1 podman[190049]: 2025-12-05 09:12:11.66743679 +0000 UTC m=+0.103347392 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 09:12:11 compute-1 sudo[190202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emikvcexzlhmglotxlyasniceuepeqjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925931.5941203-126-215028620386901/AnsiballZ_systemd_service.py'
Dec 05 09:12:11 compute-1 sudo[190202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:12 compute-1 python3.9[190204]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:12:12 compute-1 sudo[190202]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:13 compute-1 sshd-session[190205]: Received disconnect from 185.118.15.236 port 34922:11: Bye Bye [preauth]
Dec 05 09:12:13 compute-1 sshd-session[190205]: Disconnected from authenticating user root 185.118.15.236 port 34922 [preauth]
Dec 05 09:12:13 compute-1 sudo[190357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guemqvajzufdkprfqsafrqvmjxkbrfvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925932.6150517-156-180753851739405/AnsiballZ_file.py'
Dec 05 09:12:13 compute-1 sudo[190357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:13 compute-1 python3.9[190359]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:13 compute-1 sshd-session[190150]: Received disconnect from 122.168.194.41 port 54994:11: Bye Bye [preauth]
Dec 05 09:12:13 compute-1 sshd-session[190150]: Disconnected from authenticating user root 122.168.194.41 port 54994 [preauth]
Dec 05 09:12:13 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:12:13 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:12:13 compute-1 sudo[190357]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:13 compute-1 nova_compute[189066]: 2025-12-05 09:12:13.392 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:13 compute-1 nova_compute[189066]: 2025-12-05 09:12:13.424 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:13 compute-1 sudo[190510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrhaedizsrupuemypfvhgkxvrgygbrzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925933.51848-180-214270977433841/AnsiballZ_file.py'
Dec 05 09:12:13 compute-1 sudo[190510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:14 compute-1 python3.9[190512]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:14 compute-1 sudo[190510]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:14 compute-1 podman[190589]: 2025-12-05 09:12:14.617434685 +0000 UTC m=+0.055085525 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 05 09:12:14 compute-1 sudo[190683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bstfuwxvyemxqsataendlhjrpfxwgwhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925934.4079618-207-91783121207255/AnsiballZ_command.py'
Dec 05 09:12:14 compute-1 sudo[190683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:15 compute-1 python3.9[190685]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:12:15 compute-1 sudo[190683]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:15 compute-1 podman[190787]: 2025-12-05 09:12:15.647071872 +0000 UTC m=+0.076482621 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 09:12:15 compute-1 python3.9[190857]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 09:12:16 compute-1 sudo[191007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hagcloyvujermgdffpifzjcwutfljouq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925936.249132-261-171332059310676/AnsiballZ_systemd_service.py'
Dec 05 09:12:16 compute-1 sudo[191007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:16 compute-1 python3.9[191009]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:12:16 compute-1 systemd[1]: Reloading.
Dec 05 09:12:16 compute-1 systemd-rc-local-generator[191037]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:12:16 compute-1 systemd-sysv-generator[191041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:12:17 compute-1 sudo[191007]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:17 compute-1 sudo[191194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puitodorrfemqqsxqyiqiyqheyonnwna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925937.6274793-285-234282750226219/AnsiballZ_command.py'
Dec 05 09:12:17 compute-1 sudo[191194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:18 compute-1 python3.9[191196]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:12:18 compute-1 sudo[191194]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:18 compute-1 sudo[191347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qacdjoufikzqliyootowuqihunovctft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925938.4455042-312-255322421045187/AnsiballZ_file.py'
Dec 05 09:12:18 compute-1 sudo[191347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:18 compute-1 python3.9[191349]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:12:18 compute-1 sudo[191347]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:19 compute-1 python3.9[191499]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:12:20 compute-1 sudo[191651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owynwxhdnmdfifptakmahbokzysuxuqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925940.1014457-360-126306522831082/AnsiballZ_group.py'
Dec 05 09:12:20 compute-1 sudo[191651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:20 compute-1 python3.9[191653]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 05 09:12:20 compute-1 sudo[191651]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:21 compute-1 sudo[191803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qitntaqcnaviysdlppxljueuuhtzbzrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925941.3791547-393-6315942768067/AnsiballZ_getent.py'
Dec 05 09:12:21 compute-1 sudo[191803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:22 compute-1 python3.9[191805]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 05 09:12:22 compute-1 sudo[191803]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:22 compute-1 sudo[191956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkecbguxbrrdxdkggeonvhcpcyfohbqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925942.461382-417-182879599116237/AnsiballZ_group.py'
Dec 05 09:12:22 compute-1 sudo[191956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:22 compute-1 python3.9[191958]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 09:12:22 compute-1 groupadd[191959]: group added to /etc/group: name=ceilometer, GID=42405
Dec 05 09:12:22 compute-1 groupadd[191959]: group added to /etc/gshadow: name=ceilometer
Dec 05 09:12:22 compute-1 groupadd[191959]: new group: name=ceilometer, GID=42405
Dec 05 09:12:22 compute-1 sudo[191956]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:24 compute-1 sudo[192114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcliqluktilmwecsosymizesalnexjxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925943.2393682-441-279620174794248/AnsiballZ_user.py'
Dec 05 09:12:24 compute-1 sudo[192114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:24 compute-1 python3.9[192116]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 09:12:24 compute-1 useradd[192118]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 09:12:24 compute-1 useradd[192118]: add 'ceilometer' to group 'libvirt'
Dec 05 09:12:24 compute-1 useradd[192118]: add 'ceilometer' to shadow group 'libvirt'
Dec 05 09:12:24 compute-1 sudo[192114]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:29 compute-1 python3.9[192274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:30 compute-1 python3.9[192395]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764925948.89273-519-273588925372566/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:30 compute-1 python3.9[192545]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:31 compute-1 python3.9[192666]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764925950.319145-519-67644369471330/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:32 compute-1 python3.9[192816]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:33 compute-1 python3.9[192937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764925951.6181417-519-257331955376819/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:34 compute-1 python3.9[193087]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:12:34 compute-1 python3.9[193239]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:12:35 compute-1 python3.9[193391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:36 compute-1 python3.9[193512]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925955.050798-696-221851043458942/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:12:37 compute-1 python3.9[193662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:37 compute-1 python3.9[193783]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925956.2291293-696-22606927590853/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:12:38 compute-1 python3.9[193933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:38 compute-1 python3.9[194054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925957.9318485-783-257194296855882/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:12:39 compute-1 python3.9[194204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:40 compute-1 python3.9[194325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925959.351201-831-216159597103621/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:41 compute-1 python3.9[194475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:41 compute-1 podman[194570]: 2025-12-05 09:12:41.875603566 +0000 UTC m=+0.112848743 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 05 09:12:41 compute-1 python3.9[194606]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925960.6297617-876-203674992427056/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:42 compute-1 python3.9[194772]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:43 compute-1 python3.9[194893]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925962.242968-921-66542828291655/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:43 compute-1 sudo[195043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xykobcfgcdawsfnmmmnvzzzwrzyihabw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925963.5430033-966-224928998686014/AnsiballZ_file.py'
Dec 05 09:12:43 compute-1 sudo[195043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:44 compute-1 python3.9[195045]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:44 compute-1 sudo[195043]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:44 compute-1 sudo[195195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkshadjkodqwdxyfdiegkuzaaeuofxyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925964.3665152-990-12667077948260/AnsiballZ_file.py'
Dec 05 09:12:44 compute-1 sudo[195195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:44 compute-1 podman[195197]: 2025-12-05 09:12:44.752358691 +0000 UTC m=+0.051781417 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:12:44 compute-1 python3.9[195198]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:44 compute-1 sudo[195195]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:45 compute-1 python3.9[195366]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:12:46 compute-1 auditd[705]: Audit daemon rotating log files
Dec 05 09:12:46 compute-1 podman[195492]: 2025-12-05 09:12:46.387464741 +0000 UTC m=+0.057928598 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:12:46 compute-1 python3.9[195533]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:12:47 compute-1 python3.9[195690]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:12:47 compute-1 sudo[195842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxihynxccodomogeifzdfhraucnntama ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925967.6071324-1087-196929809219980/AnsiballZ_file.py'
Dec 05 09:12:47 compute-1 sudo[195842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:48 compute-1 python3.9[195844]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:12:48 compute-1 sudo[195842]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:48 compute-1 sudo[195994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqhtdhyzchsmercdgfbdgnwxqbwqijat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925968.3848746-1111-151691658757520/AnsiballZ_systemd_service.py'
Dec 05 09:12:48 compute-1 sudo[195994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:49 compute-1 python3.9[195996]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:12:49 compute-1 systemd[1]: Reloading.
Dec 05 09:12:49 compute-1 systemd-sysv-generator[196030]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:12:49 compute-1 systemd-rc-local-generator[196026]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:12:49 compute-1 systemd[1]: Listening on Podman API Socket.
Dec 05 09:12:49 compute-1 sudo[195994]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:50 compute-1 sudo[196187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhpazndczlagjjwutihqbfwaisaywfzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925969.8782723-1137-12740798644597/AnsiballZ_stat.py'
Dec 05 09:12:50 compute-1 sudo[196187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:50 compute-1 sshd-session[196034]: Received disconnect from 43.225.158.169 port 41578:11: Bye Bye [preauth]
Dec 05 09:12:50 compute-1 sshd-session[196034]: Disconnected from authenticating user root 43.225.158.169 port 41578 [preauth]
Dec 05 09:12:50 compute-1 python3.9[196189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:50 compute-1 sudo[196187]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:51 compute-1 sudo[196310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdolshocttjnnzxafdzszdueqlcoslbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925969.8782723-1137-12740798644597/AnsiballZ_copy.py'
Dec 05 09:12:51 compute-1 sudo[196310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:51 compute-1 python3.9[196312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925969.8782723-1137-12740798644597/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:12:51 compute-1 sudo[196310]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:51 compute-1 sudo[196386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwwdixorivrijsqdmnzthyffnnwzhqyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925969.8782723-1137-12740798644597/AnsiballZ_stat.py'
Dec 05 09:12:51 compute-1 sudo[196386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:51 compute-1 python3.9[196388]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:51 compute-1 sudo[196386]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:52 compute-1 sudo[196509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-genwvdetzmodguvjafyjwvxrofsfskuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925969.8782723-1137-12740798644597/AnsiballZ_copy.py'
Dec 05 09:12:52 compute-1 sudo[196509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:52 compute-1 python3.9[196511]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925969.8782723-1137-12740798644597/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:12:52 compute-1 sudo[196509]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:53 compute-1 sudo[196661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfsoawfkcggurixfuvikkxmpwlcsoley ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925973.3744352-1233-23666951047086/AnsiballZ_file.py'
Dec 05 09:12:53 compute-1 sudo[196661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:53 compute-1 python3.9[196663]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:53 compute-1 sudo[196661]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:54 compute-1 sudo[196813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swuksbwyheddbarxngnyksakskxzusgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925974.0899916-1257-146597958778360/AnsiballZ_file.py'
Dec 05 09:12:54 compute-1 sudo[196813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:54 compute-1 python3.9[196815]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:12:54 compute-1 sudo[196813]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:55 compute-1 sudo[196965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcehkwxllwtikvezjqthqclctbvpipyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925975.174355-1281-258411878961251/AnsiballZ_stat.py'
Dec 05 09:12:55 compute-1 sudo[196965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:55 compute-1 python3.9[196967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:12:55 compute-1 sudo[196965]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:56 compute-1 sudo[197088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivyylabdaeqpkdlmqrqjjdpfoktxphhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925975.174355-1281-258411878961251/AnsiballZ_copy.py'
Dec 05 09:12:56 compute-1 sudo[197088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:56 compute-1 python3.9[197090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925975.174355-1281-258411878961251/.source.json _original_basename=.ijq49868 follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:56 compute-1 sudo[197088]: pam_unix(sudo:session): session closed for user root
Dec 05 09:12:56 compute-1 python3.9[197240]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.024 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.027 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.028 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.028 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.249 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.249 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.250 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.251 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.251 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.251 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.252 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.252 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.252 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.302 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.303 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.303 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.303 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.498 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.500 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6164MB free_disk=73.53985595703125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.500 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.500 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.618 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.619 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.644 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.663 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.665 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:12:57 compute-1 nova_compute[189066]: 2025-12-05 09:12:57.666 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:12:59 compute-1 sudo[197661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcussacuolgplmbzecgmzffbcciewrca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925978.7978866-1401-67976079592084/AnsiballZ_container_config_data.py'
Dec 05 09:12:59 compute-1 sudo[197661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:12:59 compute-1 python3.9[197663]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Dec 05 09:12:59 compute-1 sudo[197661]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:00 compute-1 sudo[197813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukaadvcaayyhwthljnanlyylvwxgmise ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925980.1178975-1434-29668810639975/AnsiballZ_container_config_hash.py'
Dec 05 09:13:00 compute-1 sudo[197813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:00 compute-1 python3.9[197815]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:13:00 compute-1 sudo[197813]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:01 compute-1 sudo[197965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytcqycfzntikavnyfxstklilhtlbbwfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925981.1681905-1461-177340932889451/AnsiballZ_podman_container_info.py'
Dec 05 09:13:01 compute-1 sudo[197965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:01 compute-1 python3.9[197967]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 09:13:01 compute-1 sudo[197965]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:03 compute-1 sudo[198143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqrefisqokcjwxdmzvyixikidwftzdwi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764925983.1451116-1500-56457383346423/AnsiballZ_edpm_container_manage.py'
Dec 05 09:13:03 compute-1 sudo[198143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:04 compute-1 python3[198145]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:13:04 compute-1 podman[198181]: 2025-12-05 09:13:04.193413733 +0000 UTC m=+0.055380496 container create d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 09:13:04 compute-1 podman[198181]: 2025-12-05 09:13:04.163952917 +0000 UTC m=+0.025919710 image pull 343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 05 09:13:04 compute-1 python3[198145]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Dec 05 09:13:04 compute-1 sudo[198143]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:04 compute-1 sudo[198369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fozlipetwnfguereaqbutnkasxdczbau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925984.524505-1524-71064548372896/AnsiballZ_stat.py'
Dec 05 09:13:04 compute-1 sudo[198369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:04 compute-1 python3.9[198371]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:13:05 compute-1 sudo[198369]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:05 compute-1 sudo[198523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyxbhhjnzypzoerpbempleifcuefbiwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925985.3896782-1551-82827279438865/AnsiballZ_file.py'
Dec 05 09:13:05 compute-1 sudo[198523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:05 compute-1 python3.9[198525]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:05 compute-1 sudo[198523]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:06 compute-1 sudo[198599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvordsqmarpmbeqomztxpeifsbccijjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925985.3896782-1551-82827279438865/AnsiballZ_stat.py'
Dec 05 09:13:06 compute-1 sudo[198599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:06 compute-1 python3.9[198601]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:13:06 compute-1 sudo[198599]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:06 compute-1 sudo[198750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kllrltrftuayyimcpzttjjvnhujddfiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925986.3601997-1551-58265165055859/AnsiballZ_copy.py'
Dec 05 09:13:06 compute-1 sudo[198750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:06 compute-1 python3.9[198752]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764925986.3601997-1551-58265165055859/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:07 compute-1 sudo[198750]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:07 compute-1 sudo[198826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrnrpyhhqxcwyzafzihkbntdcfmoiptu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925986.3601997-1551-58265165055859/AnsiballZ_systemd.py'
Dec 05 09:13:07 compute-1 sudo[198826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:07 compute-1 python3.9[198828]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:13:07 compute-1 systemd[1]: Reloading.
Dec 05 09:13:07 compute-1 systemd-rc-local-generator[198855]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:13:07 compute-1 systemd-sysv-generator[198858]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:13:08 compute-1 sudo[198826]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:08 compute-1 sudo[198936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caqygatnjgevjplqugwfqywjukruhswn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925986.3601997-1551-58265165055859/AnsiballZ_systemd.py'
Dec 05 09:13:08 compute-1 sudo[198936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:08 compute-1 python3.9[198938]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:13:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:13:08.854 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:13:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:13:08.857 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:13:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:13:08.857 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:13:08 compute-1 systemd[1]: Reloading.
Dec 05 09:13:08 compute-1 systemd-rc-local-generator[198963]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:13:08 compute-1 systemd-sysv-generator[198966]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:13:09 compute-1 systemd[1]: Starting ceilometer_agent_compute container...
Dec 05 09:13:09 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:13:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84a359fcdda18f00bc6f65a40bd1462f74b9b03800960cdde7ccb6aee53b20b4/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 09:13:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84a359fcdda18f00bc6f65a40bd1462f74b9b03800960cdde7ccb6aee53b20b4/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 09:13:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84a359fcdda18f00bc6f65a40bd1462f74b9b03800960cdde7ccb6aee53b20b4/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 05 09:13:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84a359fcdda18f00bc6f65a40bd1462f74b9b03800960cdde7ccb6aee53b20b4/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Dec 05 09:13:09 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13.
Dec 05 09:13:09 compute-1 podman[198978]: 2025-12-05 09:13:09.36280276 +0000 UTC m=+0.136372182 container init d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + sudo -E kolla_set_configs
Dec 05 09:13:09 compute-1 podman[198978]: 2025-12-05 09:13:09.393608419 +0000 UTC m=+0.167177821 container start d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Dec 05 09:13:09 compute-1 podman[198978]: ceilometer_agent_compute
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: sudo: unable to send audit message: Operation not permitted
Dec 05 09:13:09 compute-1 sudo[199000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 09:13:09 compute-1 systemd[1]: Started ceilometer_agent_compute container.
Dec 05 09:13:09 compute-1 sudo[199000]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 09:13:09 compute-1 sudo[199000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 05 09:13:09 compute-1 sudo[198936]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Validating config file
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Copying service configuration files
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: INFO:__main__:Writing out command to execute
Dec 05 09:13:09 compute-1 sudo[199000]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: ++ cat /run_command
Dec 05 09:13:09 compute-1 podman[199001]: 2025-12-05 09:13:09.480023687 +0000 UTC m=+0.072359353 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + ARGS=
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + sudo kolla_copy_cacerts
Dec 05 09:13:09 compute-1 systemd[1]: d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13-5df4121e6410b318.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 09:13:09 compute-1 systemd[1]: d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13-5df4121e6410b318.service: Failed with result 'exit-code'.
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: sudo: unable to send audit message: Operation not permitted
Dec 05 09:13:09 compute-1 sudo[199023]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 09:13:09 compute-1 sudo[199023]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 05 09:13:09 compute-1 sudo[199023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 05 09:13:09 compute-1 sudo[199023]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + [[ ! -n '' ]]
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + . kolla_extend_start
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + umask 0022
Dec 05 09:13:09 compute-1 ceilometer_agent_compute[198994]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.432 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.433 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.433 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.433 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.433 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.433 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.433 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.434 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.435 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.435 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.435 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.435 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.435 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.435 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.435 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.436 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.437 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.438 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.438 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.438 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.438 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.438 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.438 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.438 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.438 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.439 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.440 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.443 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.444 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.445 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.448 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.449 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.450 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.471 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.473 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.474 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 05 09:13:10 compute-1 python3.9[199175]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.602 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.696 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.696 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.696 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.696 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.697 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.698 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.699 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.700 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.701 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.702 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.703 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.704 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.705 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.706 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.707 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.708 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.709 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.710 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.711 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.712 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.713 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.713 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.713 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.713 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.713 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.713 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.713 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.713 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.714 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.715 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.716 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.716 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.716 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.716 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.716 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.716 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.716 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.716 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.717 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.717 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.717 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.717 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.717 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.717 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.717 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.717 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.721 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.728 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.733 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.733 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.733 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.733 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.734 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:13:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:13:11 compute-1 sudo[199331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiqihbrjrubarakrmzpmfvdjqsbefmvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925991.0720031-1674-51125270355342/AnsiballZ_stat.py'
Dec 05 09:13:11 compute-1 sudo[199331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:11 compute-1 python3.9[199333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:13:11 compute-1 sudo[199331]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:12 compute-1 sudo[199467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rswioykqpqtrhdoeyzctxlmfxrmoxcap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925991.0720031-1674-51125270355342/AnsiballZ_copy.py'
Dec 05 09:13:12 compute-1 sudo[199467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:12 compute-1 podman[199430]: 2025-12-05 09:13:12.052620958 +0000 UTC m=+0.096895529 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:13:12 compute-1 sshd[130045]: Timeout before authentication for connection from 101.47.162.91 to 38.102.83.154, pid = 182640
Dec 05 09:13:12 compute-1 python3.9[199475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764925991.0720031-1674-51125270355342/.source.yaml _original_basename=.4z814ehl follow=False checksum=41e448310035f63ff5c8b5bfcd24634b9e685b7c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:12 compute-1 sudo[199467]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:12 compute-1 sudo[199634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgppcmjrgigyoupokkkidlwcaykemxgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925992.475008-1720-226326527582354/AnsiballZ_stat.py'
Dec 05 09:13:12 compute-1 sudo[199634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:12 compute-1 python3.9[199636]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:13:12 compute-1 sudo[199634]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:13 compute-1 sudo[199757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwqwicsbwiitxnazgiqkndesxulgrwdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925992.475008-1720-226326527582354/AnsiballZ_copy.py'
Dec 05 09:13:13 compute-1 sudo[199757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:13 compute-1 python3.9[199759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764925992.475008-1720-226326527582354/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:13:13 compute-1 sudo[199757]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:14 compute-1 sudo[199920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebzgxisltxmawxgcnhflyslnitbcvhyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925994.540952-1782-140316042880444/AnsiballZ_file.py'
Dec 05 09:13:14 compute-1 sudo[199920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:14 compute-1 podman[199883]: 2025-12-05 09:13:14.87450868 +0000 UTC m=+0.067194946 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 05 09:13:15 compute-1 python3.9[199928]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:15 compute-1 sudo[199920]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:15 compute-1 sudo[200080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taozffscffmyhbukfjkuimxkebzxzqmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925995.3249137-1806-84783895549697/AnsiballZ_file.py'
Dec 05 09:13:15 compute-1 sudo[200080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:15 compute-1 python3.9[200082]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:13:15 compute-1 sudo[200080]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:16 compute-1 sudo[200232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjliooitymhgibbemwmjhzfkblhfilea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925996.1080484-1830-192831088103119/AnsiballZ_stat.py'
Dec 05 09:13:16 compute-1 sudo[200232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:16 compute-1 podman[200234]: 2025-12-05 09:13:16.535487439 +0000 UTC m=+0.101200746 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 09:13:16 compute-1 python3.9[200235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:13:16 compute-1 sudo[200232]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:16 compute-1 sudo[200332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzdevkfdzfyezoiulpijppzfbmrzudzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764925996.1080484-1830-192831088103119/AnsiballZ_file.py'
Dec 05 09:13:16 compute-1 sudo[200332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:17 compute-1 python3.9[200334]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.maalkc42 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:17 compute-1 sudo[200332]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:17 compute-1 python3.9[200484]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:21 compute-1 sudo[200905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnpknllqttjelgvhvmeznffecwxtonpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926000.8199508-1941-181851350484329/AnsiballZ_container_config_data.py'
Dec 05 09:13:21 compute-1 sudo[200905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:21 compute-1 python3.9[200907]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Dec 05 09:13:21 compute-1 sudo[200905]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:22 compute-1 sudo[201057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vapssemhpzophrmhakricupfigmrltch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926001.9161813-1974-197774919344663/AnsiballZ_container_config_hash.py'
Dec 05 09:13:22 compute-1 sudo[201057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:22 compute-1 python3.9[201059]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:13:22 compute-1 sudo[201057]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:23 compute-1 sudo[201209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbnppbmoyaccfawkzipooudgvirlohbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926003.1294632-2001-43730056133038/AnsiballZ_podman_container_info.py'
Dec 05 09:13:23 compute-1 sudo[201209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:23 compute-1 python3.9[201211]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 09:13:24 compute-1 sudo[201209]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:25 compute-1 sudo[201390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzlbwppsialqtddqxcxpxtoscoutgjhy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764926005.340263-2040-70467452713284/AnsiballZ_edpm_container_manage.py'
Dec 05 09:13:25 compute-1 sudo[201390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:25 compute-1 python3[201392]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:13:26 compute-1 podman[201429]: 2025-12-05 09:13:26.184654367 +0000 UTC m=+0.049185678 container create 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:13:26 compute-1 podman[201429]: 2025-12-05 09:13:26.159340321 +0000 UTC m=+0.023871632 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 05 09:13:26 compute-1 python3[201392]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 05 09:13:26 compute-1 sudo[201390]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:26 compute-1 sshd-session[201263]: Received disconnect from 122.168.194.41 port 54576:11: Bye Bye [preauth]
Dec 05 09:13:26 compute-1 sshd-session[201263]: Disconnected from authenticating user root 122.168.194.41 port 54576 [preauth]
Dec 05 09:13:26 compute-1 sudo[201615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bafgarqjlyewsdlbvqlofhgdyohjpgkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926006.5614684-2064-122758281712186/AnsiballZ_stat.py'
Dec 05 09:13:26 compute-1 sudo[201615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:27 compute-1 python3.9[201617]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:13:27 compute-1 sudo[201615]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:27 compute-1 sudo[201769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycttgirustxfaoiqamrmjmhvutfeplky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926007.4450164-2091-25868412983198/AnsiballZ_file.py'
Dec 05 09:13:27 compute-1 sudo[201769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:27 compute-1 python3.9[201771]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:27 compute-1 sudo[201769]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:28 compute-1 sudo[201845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkurlqypexkmtnjrueidnwvsssmtbxoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926007.4450164-2091-25868412983198/AnsiballZ_stat.py'
Dec 05 09:13:28 compute-1 sudo[201845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:28 compute-1 python3.9[201847]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:13:28 compute-1 sudo[201845]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:29 compute-1 sudo[201996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixblelnskmfordzacbkykuyvfdflxtdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926008.5790026-2091-203986077185892/AnsiballZ_copy.py'
Dec 05 09:13:29 compute-1 sudo[201996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:29 compute-1 python3.9[201998]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764926008.5790026-2091-203986077185892/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:29 compute-1 sudo[201996]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:29 compute-1 sudo[202072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lulslfkbhdwbrnkzeveempopasrxxlgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926008.5790026-2091-203986077185892/AnsiballZ_systemd.py'
Dec 05 09:13:29 compute-1 sudo[202072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:29 compute-1 python3.9[202074]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:13:29 compute-1 systemd[1]: Reloading.
Dec 05 09:13:29 compute-1 systemd-sysv-generator[202104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:13:29 compute-1 systemd-rc-local-generator[202100]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:13:30 compute-1 sudo[202072]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:30 compute-1 sudo[202183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmpwwrkqflsriekblbbkkjyntoqyuvet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926008.5790026-2091-203986077185892/AnsiballZ_systemd.py'
Dec 05 09:13:30 compute-1 sudo[202183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:30 compute-1 python3.9[202185]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:13:30 compute-1 systemd[1]: Reloading.
Dec 05 09:13:30 compute-1 systemd-rc-local-generator[202217]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:13:30 compute-1 systemd-sysv-generator[202220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:13:31 compute-1 systemd[1]: Starting node_exporter container...
Dec 05 09:13:31 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:13:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32e1caee4de2cc01af82088ae7e920ad0867f6dcc8568fdb6a128580c0686e8a/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 09:13:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32e1caee4de2cc01af82088ae7e920ad0867f6dcc8568fdb6a128580c0686e8a/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 09:13:31 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96.
Dec 05 09:13:31 compute-1 podman[202227]: 2025-12-05 09:13:31.288344527 +0000 UTC m=+0.144256915 container init 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.307Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.307Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.307Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.308Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.308Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.308Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.308Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.308Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.308Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=arp
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=bcache
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=bonding
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=cpu
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=edac
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=filefd
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=netclass
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=netdev
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=netstat
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=nfs
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=nvme
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=softnet
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=systemd
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=xfs
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.309Z caller=node_exporter.go:117 level=info collector=zfs
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.310Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 05 09:13:31 compute-1 node_exporter[202241]: ts=2025-12-05T09:13:31.311Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 05 09:13:31 compute-1 podman[202227]: 2025-12-05 09:13:31.31603709 +0000 UTC m=+0.171949448 container start 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:13:31 compute-1 podman[202227]: node_exporter
Dec 05 09:13:31 compute-1 systemd[1]: Started node_exporter container.
Dec 05 09:13:31 compute-1 sudo[202183]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:31 compute-1 podman[202251]: 2025-12-05 09:13:31.415305538 +0000 UTC m=+0.087127938 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:13:31 compute-1 sshd-session[202186]: Received disconnect from 185.118.15.236 port 35046:11: Bye Bye [preauth]
Dec 05 09:13:31 compute-1 sshd-session[202186]: Disconnected from authenticating user root 185.118.15.236 port 35046 [preauth]
Dec 05 09:13:32 compute-1 python3.9[202423]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Dec 05 09:13:33 compute-1 sshd[130045]: drop connection #0 from [101.47.162.91]:39218 on [38.102.83.154]:22 penalty: exceeded LoginGraceTime
Dec 05 09:13:34 compute-1 sudo[202573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbkegeplrfijgtundqnvuoewjspzlogp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926013.7817101-2214-103907590293606/AnsiballZ_stat.py'
Dec 05 09:13:34 compute-1 sudo[202573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:34 compute-1 python3.9[202575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:13:34 compute-1 sudo[202573]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:34 compute-1 sudo[202698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnplrkvrvpoglfyycncextguqvhkigwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926013.7817101-2214-103907590293606/AnsiballZ_copy.py'
Dec 05 09:13:34 compute-1 sudo[202698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:34 compute-1 python3.9[202700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764926013.7817101-2214-103907590293606/.source.yaml _original_basename=.j1n8a4vm follow=False checksum=90f1137b1cb26b58a8bdb23c78f3a76c77ae62a6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:34 compute-1 sudo[202698]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:35 compute-1 sudo[202850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zavgrywtwrbijehdrdbtglzikxgqyyqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926015.1541667-2259-46894368627313/AnsiballZ_stat.py'
Dec 05 09:13:35 compute-1 sudo[202850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:35 compute-1 python3.9[202852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:13:35 compute-1 sudo[202850]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:36 compute-1 sudo[202973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bciealjzsfvdkzrhesbgcrycuqxjhaxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926015.1541667-2259-46894368627313/AnsiballZ_copy.py'
Dec 05 09:13:36 compute-1 sudo[202973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:36 compute-1 python3.9[202975]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764926015.1541667-2259-46894368627313/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:13:36 compute-1 sudo[202973]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:38 compute-1 sudo[203125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulzicafyyudjnzkpycbhhiwsntvxizcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926017.6556551-2322-277105381595769/AnsiballZ_file.py'
Dec 05 09:13:38 compute-1 sudo[203125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:38 compute-1 python3.9[203127]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:38 compute-1 sudo[203125]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:39 compute-1 sudo[203277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtlymfaqlmcwmekffsfhljfcwsxxnezv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926018.794513-2346-169756155478742/AnsiballZ_file.py'
Dec 05 09:13:39 compute-1 sudo[203277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:39 compute-1 python3.9[203279]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:13:39 compute-1 sudo[203277]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:39 compute-1 podman[203304]: 2025-12-05 09:13:39.637597363 +0000 UTC m=+0.070424497 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 09:13:39 compute-1 systemd[1]: d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13-5df4121e6410b318.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 09:13:39 compute-1 systemd[1]: d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13-5df4121e6410b318.service: Failed with result 'exit-code'.
Dec 05 09:13:39 compute-1 sudo[203449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcwqgepdczaycutwevzxcyftjjuutmho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926019.6511846-2370-85315490967573/AnsiballZ_stat.py'
Dec 05 09:13:39 compute-1 sudo[203449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:40 compute-1 python3.9[203451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:13:40 compute-1 sudo[203449]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:40 compute-1 sudo[203527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvdqekayzstwkacmidrinjbhlivjnslb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926019.6511846-2370-85315490967573/AnsiballZ_file.py'
Dec 05 09:13:40 compute-1 sudo[203527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:40 compute-1 python3.9[203529]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.sq9ueayo recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:40 compute-1 sudo[203527]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:41 compute-1 python3.9[203679]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:42 compute-1 podman[203900]: 2025-12-05 09:13:42.665040487 +0000 UTC m=+0.096176604 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:13:44 compute-1 sudo[204126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmuzecgpxljsgfwymyspxlzooedcocyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926023.9225736-2481-277069642600801/AnsiballZ_container_config_data.py'
Dec 05 09:13:44 compute-1 sudo[204126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:44 compute-1 python3.9[204128]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Dec 05 09:13:44 compute-1 sudo[204126]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:45 compute-1 sudo[204291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auugbmpjdhbrracgloptoqgkhwyzjthg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926024.9902072-2514-223312000721051/AnsiballZ_container_config_hash.py'
Dec 05 09:13:45 compute-1 sudo[204291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:45 compute-1 podman[204252]: 2025-12-05 09:13:45.287696057 +0000 UTC m=+0.050620723 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 09:13:45 compute-1 python3.9[204297]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:13:45 compute-1 sudo[204291]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:46 compute-1 sudo[204447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gunymjhpvzfvbdttucuhkdsqpyirxuqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926025.9077997-2541-211225060173863/AnsiballZ_podman_container_info.py'
Dec 05 09:13:46 compute-1 sudo[204447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:46 compute-1 python3.9[204449]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 09:13:46 compute-1 sudo[204447]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:47 compute-1 podman[204499]: 2025-12-05 09:13:47.637900783 +0000 UTC m=+0.070742084 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:13:48 compute-1 sudo[204644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxiyuhitdxknqvhuulydnkkwntqfjnbg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764926028.3652053-2580-260180803265316/AnsiballZ_edpm_container_manage.py'
Dec 05 09:13:48 compute-1 sudo[204644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:48 compute-1 python3[204646]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:13:51 compute-1 podman[204659]: 2025-12-05 09:13:51.227002047 +0000 UTC m=+2.158970039 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 05 09:13:51 compute-1 podman[204756]: 2025-12-05 09:13:51.409869455 +0000 UTC m=+0.074613548 container create b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:13:51 compute-1 podman[204756]: 2025-12-05 09:13:51.362115151 +0000 UTC m=+0.026859264 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 05 09:13:51 compute-1 python3[204646]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 05 09:13:51 compute-1 sudo[204644]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:52 compute-1 sudo[204942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mczfssaaszdnkamobwtndjncnsvrhmmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926031.8498023-2604-163694769022963/AnsiballZ_stat.py'
Dec 05 09:13:52 compute-1 sudo[204942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:52 compute-1 python3.9[204944]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:13:52 compute-1 sudo[204942]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:53 compute-1 sudo[205096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvwvfihlntabgmoccosaqwrhqwlenbnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926032.73725-2631-39506542968408/AnsiballZ_file.py'
Dec 05 09:13:53 compute-1 sudo[205096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:53 compute-1 python3.9[205098]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:53 compute-1 sudo[205096]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:53 compute-1 sudo[205172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktzzrenyqayvilfizddnpqdmdpskxibh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926032.73725-2631-39506542968408/AnsiballZ_stat.py'
Dec 05 09:13:53 compute-1 sudo[205172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:53 compute-1 python3.9[205174]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:13:53 compute-1 sudo[205172]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:54 compute-1 sudo[205323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qacgrfmcskommqdsdulxranvgjmpozne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926033.739661-2631-93135690504884/AnsiballZ_copy.py'
Dec 05 09:13:54 compute-1 sudo[205323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:54 compute-1 python3.9[205325]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764926033.739661-2631-93135690504884/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:54 compute-1 sudo[205323]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:54 compute-1 sudo[205399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbaeorplucugpopjfrcaqvyanchpbivs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926033.739661-2631-93135690504884/AnsiballZ_systemd.py'
Dec 05 09:13:54 compute-1 sudo[205399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:55 compute-1 python3.9[205401]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:13:55 compute-1 systemd[1]: Reloading.
Dec 05 09:13:55 compute-1 systemd-sysv-generator[205433]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:13:55 compute-1 systemd-rc-local-generator[205429]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:13:55 compute-1 sudo[205399]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:55 compute-1 sudo[205510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckuajudppxedlyysrgocckawsrmkaokq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926033.739661-2631-93135690504884/AnsiballZ_systemd.py'
Dec 05 09:13:55 compute-1 sudo[205510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:55 compute-1 python3.9[205512]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:13:55 compute-1 systemd[1]: Reloading.
Dec 05 09:13:56 compute-1 systemd-sysv-generator[205546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:13:56 compute-1 systemd-rc-local-generator[205543]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:13:56 compute-1 systemd[1]: Starting podman_exporter container...
Dec 05 09:13:56 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:13:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb30ff3d150a5f800f247a971dd9d74c2fdb82820bc58d62c72421764c0a6470/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 09:13:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb30ff3d150a5f800f247a971dd9d74c2fdb82820bc58d62c72421764c0a6470/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 09:13:56 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68.
Dec 05 09:13:56 compute-1 podman[205552]: 2025-12-05 09:13:56.456098318 +0000 UTC m=+0.138680302 container init b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:13:56 compute-1 podman_exporter[205567]: ts=2025-12-05T09:13:56.476Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 05 09:13:56 compute-1 podman_exporter[205567]: ts=2025-12-05T09:13:56.476Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 05 09:13:56 compute-1 podman_exporter[205567]: ts=2025-12-05T09:13:56.476Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 05 09:13:56 compute-1 podman_exporter[205567]: ts=2025-12-05T09:13:56.476Z caller=handler.go:105 level=info collector=container
Dec 05 09:13:56 compute-1 podman[205552]: 2025-12-05 09:13:56.486918646 +0000 UTC m=+0.169500630 container start b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:13:56 compute-1 podman[205552]: podman_exporter
Dec 05 09:13:56 compute-1 systemd[1]: Starting Podman API Service...
Dec 05 09:13:56 compute-1 systemd[1]: Started Podman API Service.
Dec 05 09:13:56 compute-1 systemd[1]: Started podman_exporter container.
Dec 05 09:13:56 compute-1 sudo[205510]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:56 compute-1 podman[205580]: time="2025-12-05T09:13:56Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 05 09:13:56 compute-1 podman[205580]: time="2025-12-05T09:13:56Z" level=info msg="Setting parallel job count to 25"
Dec 05 09:13:56 compute-1 podman[205580]: time="2025-12-05T09:13:56Z" level=info msg="Using sqlite as database backend"
Dec 05 09:13:56 compute-1 podman[205580]: time="2025-12-05T09:13:56Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 05 09:13:56 compute-1 podman[205580]: time="2025-12-05T09:13:56Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 05 09:13:56 compute-1 podman[205580]: time="2025-12-05T09:13:56Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 05 09:13:56 compute-1 podman[205580]: @ - - [05/Dec/2025:09:13:56 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 05 09:13:56 compute-1 podman[205580]: time="2025-12-05T09:13:56Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 09:13:56 compute-1 podman[205577]: 2025-12-05 09:13:56.604716358 +0000 UTC m=+0.100389535 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:13:56 compute-1 podman[205580]: @ - - [05/Dec/2025:09:13:56 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 20705 "" "Go-http-client/1.1"
Dec 05 09:13:56 compute-1 podman_exporter[205567]: ts=2025-12-05T09:13:56.606Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 05 09:13:56 compute-1 podman_exporter[205567]: ts=2025-12-05T09:13:56.607Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 05 09:13:56 compute-1 podman_exporter[205567]: ts=2025-12-05T09:13:56.607Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 05 09:13:56 compute-1 systemd[1]: b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68-6ac2323c86f04350.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 09:13:56 compute-1 systemd[1]: b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68-6ac2323c86f04350.service: Failed with result 'exit-code'.
Dec 05 09:13:57 compute-1 nova_compute[189066]: 2025-12-05 09:13:57.654 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:57 compute-1 nova_compute[189066]: 2025-12-05 09:13:57.903 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:57 compute-1 nova_compute[189066]: 2025-12-05 09:13:57.904 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:13:57 compute-1 nova_compute[189066]: 2025-12-05 09:13:57.904 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:13:57 compute-1 python3.9[205765]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Dec 05 09:13:58 compute-1 sshd-session[205638]: Received disconnect from 43.225.158.169 port 54721:11: Bye Bye [preauth]
Dec 05 09:13:58 compute-1 sshd-session[205638]: Disconnected from authenticating user root 43.225.158.169 port 54721 [preauth]
Dec 05 09:13:58 compute-1 sudo[205915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzgtejithdqyremneveewvsdpudinbpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926038.5544581-2754-173130725033554/AnsiballZ_stat.py'
Dec 05 09:13:58 compute-1 sudo[205915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:58 compute-1 nova_compute[189066]: 2025-12-05 09:13:58.946 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:13:58 compute-1 nova_compute[189066]: 2025-12-05 09:13:58.947 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:58 compute-1 nova_compute[189066]: 2025-12-05 09:13:58.948 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:58 compute-1 nova_compute[189066]: 2025-12-05 09:13:58.948 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:58 compute-1 nova_compute[189066]: 2025-12-05 09:13:58.948 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:58 compute-1 nova_compute[189066]: 2025-12-05 09:13:58.948 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:58 compute-1 nova_compute[189066]: 2025-12-05 09:13:58.948 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:13:58 compute-1 nova_compute[189066]: 2025-12-05 09:13:58.949 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.004 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.005 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.005 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.006 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:13:59 compute-1 python3.9[205917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:13:59 compute-1 sudo[205915]: pam_unix(sudo:session): session closed for user root
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.213 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.214 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6053MB free_disk=73.48635482788086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.214 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.215 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.309 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.309 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.346 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.370 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.373 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.374 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.447 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:59 compute-1 nova_compute[189066]: 2025-12-05 09:13:59.449 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:13:59 compute-1 sudo[206040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymfzyqroituhqkqwklxsyjucqdffdfdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926038.5544581-2754-173130725033554/AnsiballZ_copy.py'
Dec 05 09:13:59 compute-1 sudo[206040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:13:59 compute-1 python3.9[206042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764926038.5544581-2754-173130725033554/.source.yaml _original_basename=.cx24g3l7 follow=False checksum=c953f870697b7b81e66562db3a2f6a52c706688b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:13:59 compute-1 sudo[206040]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:00 compute-1 sudo[206192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjopdynrdnxsxcrnnborfsxbpzimlfzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926040.0055661-2799-13852530415261/AnsiballZ_stat.py'
Dec 05 09:14:00 compute-1 sudo[206192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:00 compute-1 python3.9[206194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:14:00 compute-1 sudo[206192]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:01 compute-1 sudo[206315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yybdugjlaytceqtmsafvvmqeiblwvhxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926040.0055661-2799-13852530415261/AnsiballZ_copy.py'
Dec 05 09:14:01 compute-1 sudo[206315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:01 compute-1 python3.9[206317]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764926040.0055661-2799-13852530415261/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:14:01 compute-1 sudo[206315]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:01 compute-1 podman[206318]: 2025-12-05 09:14:01.54065432 +0000 UTC m=+0.062182410 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:14:02 compute-1 sudo[206491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmciywnuczonukpszxsdhzeijhvyxnce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926042.5224233-2862-42101148438037/AnsiballZ_file.py'
Dec 05 09:14:02 compute-1 sudo[206491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:03 compute-1 python3.9[206493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:14:03 compute-1 sudo[206491]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:03 compute-1 sudo[206643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxbbxjnbrsbrjvgibzlmhtyrzsjdqqwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926043.3134909-2886-91299867778657/AnsiballZ_file.py'
Dec 05 09:14:03 compute-1 sudo[206643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:03 compute-1 python3.9[206645]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 09:14:03 compute-1 sudo[206643]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:04 compute-1 sudo[206795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhfhklierwjkyyephkimgjuwcldrzhkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926044.08916-2910-164845086559965/AnsiballZ_stat.py'
Dec 05 09:14:04 compute-1 sudo[206795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:04 compute-1 python3.9[206797]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:14:04 compute-1 sudo[206795]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:04 compute-1 sudo[206873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myejqbiwzlrwtmmsdrxgeeuqkpmmyzxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926044.08916-2910-164845086559965/AnsiballZ_file.py'
Dec 05 09:14:04 compute-1 sudo[206873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:05 compute-1 python3.9[206875]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.m8cbijx5 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:14:05 compute-1 sudo[206873]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:05 compute-1 python3.9[207025]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:14:07 compute-1 sudo[207446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hphxysrecbvczytwuubdazukicviskar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926047.5816884-3021-198493279476200/AnsiballZ_container_config_data.py'
Dec 05 09:14:07 compute-1 sudo[207446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:08 compute-1 python3.9[207448]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Dec 05 09:14:08 compute-1 sudo[207446]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:14:08.856 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:14:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:14:08.859 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:14:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:14:08.860 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:14:08 compute-1 sudo[207598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdsjxrawofowodfwuwsannkyfgbypdge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926048.6197512-3054-179578046075715/AnsiballZ_container_config_hash.py'
Dec 05 09:14:08 compute-1 sudo[207598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:09 compute-1 python3.9[207600]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 05 09:14:09 compute-1 sudo[207598]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:09 compute-1 sudo[207763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwryhksmbjbjmonbyjemmtuwcacitjtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926049.5398927-3081-240064077453240/AnsiballZ_podman_container_info.py'
Dec 05 09:14:09 compute-1 sudo[207763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:09 compute-1 podman[207724]: 2025-12-05 09:14:09.882161841 +0000 UTC m=+0.092014465 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:14:09 compute-1 systemd[1]: d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13-5df4121e6410b318.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 09:14:09 compute-1 systemd[1]: d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13-5df4121e6410b318.service: Failed with result 'exit-code'.
Dec 05 09:14:10 compute-1 python3.9[207771]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 09:14:10 compute-1 sudo[207763]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:12 compute-1 sudo[207949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shtypgntljpqssdkeqgvczwzfckcodee ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764926052.0921378-3120-216239974872175/AnsiballZ_edpm_container_manage.py'
Dec 05 09:14:12 compute-1 sudo[207949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:12 compute-1 python3[207951]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Dec 05 09:14:13 compute-1 podman[207993]: 2025-12-05 09:14:13.669364416 +0000 UTC m=+0.110091406 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:14:15 compute-1 podman[208033]: 2025-12-05 09:14:15.844757087 +0000 UTC m=+0.200453871 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 09:14:17 compute-1 podman[207966]: 2025-12-05 09:14:17.908200486 +0000 UTC m=+5.129965870 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 05 09:14:18 compute-1 podman[208106]: 2025-12-05 09:14:18.100862669 +0000 UTC m=+0.060927989 container create b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6)
Dec 05 09:14:18 compute-1 podman[208106]: 2025-12-05 09:14:18.067702726 +0000 UTC m=+0.027768076 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 05 09:14:18 compute-1 python3[207951]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 05 09:14:18 compute-1 sudo[207949]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:18 compute-1 podman[208220]: 2025-12-05 09:14:18.640076142 +0000 UTC m=+0.067825166 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec 05 09:14:18 compute-1 sudo[208313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpxtmmwmlkuepqitxuhdwwyfxxcstffd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926058.464975-3144-156579848743371/AnsiballZ_stat.py'
Dec 05 09:14:18 compute-1 sudo[208313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:18 compute-1 python3.9[208315]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:14:18 compute-1 sudo[208313]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:19 compute-1 sudo[208467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clozvyikwmmsfcxydonxjgozhkjzresx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926059.4127123-3171-3717787106449/AnsiballZ_file.py'
Dec 05 09:14:19 compute-1 sudo[208467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:19 compute-1 python3.9[208469]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:14:19 compute-1 sudo[208467]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:20 compute-1 sudo[208543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlsacrexhvmvwhpyyervuqbhjblbvywl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926059.4127123-3171-3717787106449/AnsiballZ_stat.py'
Dec 05 09:14:20 compute-1 sudo[208543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:20 compute-1 python3.9[208545]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:14:20 compute-1 sudo[208543]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:21 compute-1 sudo[208694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjxhergsrzewghfdovjuiodsswsrvcob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926060.5237224-3171-135066146230653/AnsiballZ_copy.py'
Dec 05 09:14:21 compute-1 sudo[208694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:21 compute-1 python3.9[208696]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764926060.5237224-3171-135066146230653/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:14:21 compute-1 sudo[208694]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:21 compute-1 sudo[208770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdxypbesvqkcdypjjmsitilvysucnmfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926060.5237224-3171-135066146230653/AnsiballZ_systemd.py'
Dec 05 09:14:21 compute-1 sudo[208770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:21 compute-1 python3.9[208772]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 09:14:21 compute-1 systemd[1]: Reloading.
Dec 05 09:14:21 compute-1 systemd-rc-local-generator[208798]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:14:21 compute-1 systemd-sysv-generator[208801]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:14:22 compute-1 sudo[208770]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:22 compute-1 sudo[208881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxepdgskzxgkyryoeyaihapanvfgrbvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926060.5237224-3171-135066146230653/AnsiballZ_systemd.py'
Dec 05 09:14:22 compute-1 sudo[208881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:22 compute-1 python3.9[208883]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 09:14:22 compute-1 systemd[1]: Reloading.
Dec 05 09:14:22 compute-1 systemd-rc-local-generator[208912]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 09:14:22 compute-1 systemd-sysv-generator[208917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 09:14:23 compute-1 systemd[1]: Starting openstack_network_exporter container...
Dec 05 09:14:23 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:14:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d36f331b8deed1f86994addc156b7fa91d4666d0d242b6f37798b56250d15038/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 05 09:14:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d36f331b8deed1f86994addc156b7fa91d4666d0d242b6f37798b56250d15038/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 09:14:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d36f331b8deed1f86994addc156b7fa91d4666d0d242b6f37798b56250d15038/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 09:14:23 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b.
Dec 05 09:14:23 compute-1 podman[208922]: 2025-12-05 09:14:23.357961783 +0000 UTC m=+0.136726305 container init b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *bridge.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *coverage.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *datapath.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *iface.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *memory.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *ovnnorthd.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *ovn.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *ovsdbserver.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *pmd_perf.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *pmd_rxq.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: INFO    09:14:23 main.go:48: registering *vswitch.Collector
Dec 05 09:14:23 compute-1 openstack_network_exporter[208938]: NOTICE  09:14:23 main.go:76: listening on https://:9105/metrics
Dec 05 09:14:23 compute-1 podman[208922]: 2025-12-05 09:14:23.389055377 +0000 UTC m=+0.167819879 container start b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Dec 05 09:14:23 compute-1 podman[208922]: openstack_network_exporter
Dec 05 09:14:23 compute-1 systemd[1]: Started openstack_network_exporter container.
Dec 05 09:14:23 compute-1 sudo[208881]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:23 compute-1 podman[208944]: 2025-12-05 09:14:23.491748687 +0000 UTC m=+0.085093639 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Dec 05 09:14:24 compute-1 python3.9[209120]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Dec 05 09:14:25 compute-1 sudo[209270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbloqjuduykwwmrcnbltribjwifcjxzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926065.3260984-3294-92474944680298/AnsiballZ_stat.py'
Dec 05 09:14:25 compute-1 sudo[209270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:25 compute-1 python3.9[209272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:14:25 compute-1 sudo[209270]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:26 compute-1 sudo[209395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdwjejnscxarzykloyrmapleknvoilml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926065.3260984-3294-92474944680298/AnsiballZ_copy.py'
Dec 05 09:14:26 compute-1 sudo[209395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:26 compute-1 python3.9[209397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764926065.3260984-3294-92474944680298/.source.yaml _original_basename=.wqyc5nsb follow=False checksum=d4b2f258b6c738b1fcd15ed914d3706c758e0c3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:14:26 compute-1 sudo[209395]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:27 compute-1 sudo[209561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-negbwnejpevwniqhglvsblxnisbfaics ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926066.6988823-3339-47364576694623/AnsiballZ_find.py'
Dec 05 09:14:27 compute-1 sudo[209561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:27 compute-1 podman[209521]: 2025-12-05 09:14:27.023518706 +0000 UTC m=+0.066482728 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:14:27 compute-1 python3.9[209574]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 09:14:27 compute-1 sudo[209561]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:32 compute-1 podman[209599]: 2025-12-05 09:14:32.746788573 +0000 UTC m=+0.063441988 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:14:40 compute-1 podman[209623]: 2025-12-05 09:14:40.609754411 +0000 UTC m=+0.047355973 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, org.label-schema.build-date=20251125, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 05 09:14:40 compute-1 systemd[1]: d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13-5df4121e6410b318.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 09:14:40 compute-1 systemd[1]: d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13-5df4121e6410b318.service: Failed with result 'exit-code'.
Dec 05 09:14:44 compute-1 sshd-session[209642]: Received disconnect from 122.168.194.41 port 34064:11: Bye Bye [preauth]
Dec 05 09:14:44 compute-1 sshd-session[209642]: Disconnected from authenticating user root 122.168.194.41 port 34064 [preauth]
Dec 05 09:14:44 compute-1 podman[209644]: 2025-12-05 09:14:44.664711651 +0000 UTC m=+0.103081254 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 09:14:46 compute-1 podman[209670]: 2025-12-05 09:14:46.608060476 +0000 UTC m=+0.047474725 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 05 09:14:49 compute-1 podman[209689]: 2025-12-05 09:14:49.61524479 +0000 UTC m=+0.058874217 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:14:50 compute-1 sshd-session[209710]: Received disconnect from 185.118.15.236 port 35166:11: Bye Bye [preauth]
Dec 05 09:14:50 compute-1 sshd-session[209710]: Disconnected from authenticating user root 185.118.15.236 port 35166 [preauth]
Dec 05 09:14:51 compute-1 sudo[209837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqlajcoxxkwhnopvhwwwfbtndtzntqkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926091.4892313-3597-164390421234837/AnsiballZ_podman_container_info.py'
Dec 05 09:14:51 compute-1 sudo[209837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:51 compute-1 python3.9[209839]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 05 09:14:52 compute-1 sudo[209837]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:52 compute-1 sudo[210002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maxqqlzarpcikmqitgmeulkfaufnxjjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926092.2857823-3605-66784706611515/AnsiballZ_podman_container_exec.py'
Dec 05 09:14:52 compute-1 sudo[210002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:52 compute-1 python3.9[210004]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:14:52 compute-1 systemd[1]: Started libpod-conmon-0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf.scope.
Dec 05 09:14:52 compute-1 podman[210005]: 2025-12-05 09:14:52.891167824 +0000 UTC m=+0.081273292 container exec 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 05 09:14:52 compute-1 podman[210005]: 2025-12-05 09:14:52.929365536 +0000 UTC m=+0.119470994 container exec_died 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:14:52 compute-1 systemd[1]: libpod-conmon-0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf.scope: Deactivated successfully.
Dec 05 09:14:52 compute-1 sudo[210002]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:53 compute-1 sudo[210184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utzbqnltswbhfamyrwjkiyqazbqzzhlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926093.1435893-3613-99507010132200/AnsiballZ_podman_container_exec.py'
Dec 05 09:14:53 compute-1 sudo[210184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:53 compute-1 python3.9[210186]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:14:53 compute-1 podman[210187]: 2025-12-05 09:14:53.63666485 +0000 UTC m=+0.072908672 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 05 09:14:53 compute-1 systemd[1]: Started libpod-conmon-0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf.scope.
Dec 05 09:14:53 compute-1 podman[210202]: 2025-12-05 09:14:53.900708476 +0000 UTC m=+0.282475088 container exec 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Dec 05 09:14:53 compute-1 podman[210202]: 2025-12-05 09:14:53.907434537 +0000 UTC m=+0.289201149 container exec_died 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 09:14:53 compute-1 sudo[210184]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:53 compute-1 systemd[1]: libpod-conmon-0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf.scope: Deactivated successfully.
Dec 05 09:14:54 compute-1 sudo[210388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhchdallkcfscutzjqfinmejuojbycet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926094.1291091-3621-18123307081858/AnsiballZ_file.py'
Dec 05 09:14:54 compute-1 sudo[210388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:54 compute-1 python3.9[210390]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:14:54 compute-1 sudo[210388]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:55 compute-1 sudo[210540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ittkmdswyntcvknbwksrtipewifquvfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926094.8903558-3630-63577165118998/AnsiballZ_podman_container_info.py'
Dec 05 09:14:55 compute-1 sudo[210540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:55 compute-1 python3.9[210542]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 05 09:14:55 compute-1 sudo[210540]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:56 compute-1 sudo[210706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkodzsmyyiwdiecchfszmxweqqotmhcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926095.856682-3638-232782378394111/AnsiballZ_podman_container_exec.py'
Dec 05 09:14:56 compute-1 sudo[210706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:56 compute-1 python3.9[210708]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:14:56 compute-1 systemd[1]: Started libpod-conmon-e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5.scope.
Dec 05 09:14:56 compute-1 podman[210709]: 2025-12-05 09:14:56.497198992 +0000 UTC m=+0.091253091 container exec e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:14:56 compute-1 podman[210709]: 2025-12-05 09:14:56.531095301 +0000 UTC m=+0.125149370 container exec_died e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 05 09:14:56 compute-1 systemd[1]: libpod-conmon-e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5.scope: Deactivated successfully.
Dec 05 09:14:56 compute-1 sudo[210706]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:57 compute-1 sudo[210890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bywxjmhnpydiufaorodtjuywssaimudu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926096.7711616-3646-70804688427104/AnsiballZ_podman_container_exec.py'
Dec 05 09:14:57 compute-1 sudo[210890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:57 compute-1 podman[210892]: 2025-12-05 09:14:57.15867256 +0000 UTC m=+0.064781548 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:14:57 compute-1 python3.9[210893]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:14:57 compute-1 systemd[1]: Started libpod-conmon-e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5.scope.
Dec 05 09:14:57 compute-1 podman[210917]: 2025-12-05 09:14:57.39771257 +0000 UTC m=+0.082715066 container exec e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 09:14:57 compute-1 podman[210917]: 2025-12-05 09:14:57.4069303 +0000 UTC m=+0.091932776 container exec_died e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:14:57 compute-1 sudo[210890]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:57 compute-1 systemd[1]: libpod-conmon-e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5.scope: Deactivated successfully.
Dec 05 09:14:58 compute-1 nova_compute[189066]: 2025-12-05 09:14:58.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:14:58 compute-1 nova_compute[189066]: 2025-12-05 09:14:58.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:14:58 compute-1 nova_compute[189066]: 2025-12-05 09:14:58.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:14:58 compute-1 nova_compute[189066]: 2025-12-05 09:14:58.101 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:14:58 compute-1 nova_compute[189066]: 2025-12-05 09:14:58.101 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:14:58 compute-1 nova_compute[189066]: 2025-12-05 09:14:58.102 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:14:58 compute-1 nova_compute[189066]: 2025-12-05 09:14:58.102 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:14:58 compute-1 sudo[211099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tytrdfktgxagdwtksvfssgamxpskrndw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926097.647359-3654-81229064179382/AnsiballZ_file.py'
Dec 05 09:14:58 compute-1 sudo[211099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:58 compute-1 python3.9[211101]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:14:58 compute-1 sudo[211099]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:58 compute-1 sudo[211251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkevdztgkecxysciupthbywkxnvtdlqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926098.6018574-3663-218742827233991/AnsiballZ_podman_container_info.py'
Dec 05 09:14:58 compute-1 sudo[211251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.097 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.098 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.098 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.099 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:14:59 compute-1 python3.9[211253]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 05 09:14:59 compute-1 sudo[211251]: pam_unix(sudo:session): session closed for user root
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.277 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.279 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5908MB free_disk=73.36846923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.279 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.280 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.458 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.458 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.490 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.520 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.522 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:14:59 compute-1 nova_compute[189066]: 2025-12-05 09:14:59.523 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:14:59 compute-1 sudo[211417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcyemfclvsoakcfhzfvjykzzzeicxyot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926099.4839008-3671-200938696733596/AnsiballZ_podman_container_exec.py'
Dec 05 09:14:59 compute-1 sudo[211417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:14:59 compute-1 python3.9[211419]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:00 compute-1 systemd[1]: Started libpod-conmon-3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176.scope.
Dec 05 09:15:00 compute-1 podman[211420]: 2025-12-05 09:15:00.452790418 +0000 UTC m=+0.444772954 container exec 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 05 09:15:00 compute-1 nova_compute[189066]: 2025-12-05 09:15:00.522 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:15:00 compute-1 podman[211420]: 2025-12-05 09:15:00.720354789 +0000 UTC m=+0.712337305 container exec_died 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:15:00 compute-1 systemd[1]: libpod-conmon-3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176.scope: Deactivated successfully.
Dec 05 09:15:00 compute-1 sudo[211417]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:01 compute-1 sudo[211602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gviuxgjuxppetexjfjjfnzbydrlggqjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926101.1645446-3679-66472656137240/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:01 compute-1 sudo[211602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:01 compute-1 python3.9[211604]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:01 compute-1 systemd[1]: Started libpod-conmon-3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176.scope.
Dec 05 09:15:01 compute-1 podman[211605]: 2025-12-05 09:15:01.826507999 +0000 UTC m=+0.084501449 container exec 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:15:01 compute-1 podman[211605]: 2025-12-05 09:15:01.860981482 +0000 UTC m=+0.118974922 container exec_died 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3)
Dec 05 09:15:01 compute-1 systemd[1]: libpod-conmon-3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176.scope: Deactivated successfully.
Dec 05 09:15:01 compute-1 sudo[211602]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:02 compute-1 sudo[211784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqsqejoulezifdsjjhvljtdimyczzinn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926102.1665635-3687-221192044523992/AnsiballZ_file.py'
Dec 05 09:15:02 compute-1 sudo[211784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:02 compute-1 python3.9[211786]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:02 compute-1 sudo[211784]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:03 compute-1 sudo[211945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czrqgwimcpaqjbxzjhwcogfodjrghaxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926102.941943-3696-169521625129813/AnsiballZ_podman_container_info.py'
Dec 05 09:15:03 compute-1 sudo[211945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:03 compute-1 podman[211910]: 2025-12-05 09:15:03.281341646 +0000 UTC m=+0.078920286 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:15:03 compute-1 python3.9[211955]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 05 09:15:03 compute-1 sudo[211945]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:04 compute-1 sudo[212125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilyuvearedyqiryugxbiafclbcffnwrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926103.7776463-3704-135568278565092/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:04 compute-1 sudo[212125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:04 compute-1 python3.9[212127]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:04 compute-1 systemd[1]: Started libpod-conmon-d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13.scope.
Dec 05 09:15:04 compute-1 podman[212128]: 2025-12-05 09:15:04.462081557 +0000 UTC m=+0.101068015 container exec d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:15:04 compute-1 podman[212128]: 2025-12-05 09:15:04.496996751 +0000 UTC m=+0.135983179 container exec_died d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 05 09:15:04 compute-1 systemd[1]: libpod-conmon-d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13.scope: Deactivated successfully.
Dec 05 09:15:04 compute-1 sudo[212125]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:05 compute-1 sudo[212310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-griueckxevxrbgdnjkbzojpxcmslqmfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926104.7817929-3712-273565403816609/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:05 compute-1 sudo[212310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:05 compute-1 python3.9[212312]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:05 compute-1 systemd[1]: Started libpod-conmon-d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13.scope.
Dec 05 09:15:05 compute-1 podman[212313]: 2025-12-05 09:15:05.498930441 +0000 UTC m=+0.167987713 container exec d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 
9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:15:05 compute-1 podman[212313]: 2025-12-05 09:15:05.540825902 +0000 UTC m=+0.209883144 container exec_died d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 09:15:05 compute-1 systemd[1]: libpod-conmon-d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13.scope: Deactivated successfully.
Dec 05 09:15:05 compute-1 sudo[212310]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:06 compute-1 sudo[212496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axoeimhznnhboxwxngxuznjnvgmpfaxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926105.7902348-3720-134171430691464/AnsiballZ_file.py'
Dec 05 09:15:06 compute-1 sudo[212496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:06 compute-1 python3.9[212498]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:06 compute-1 sudo[212496]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:06 compute-1 sshd-session[212358]: Received disconnect from 43.225.158.169 port 39631:11: Bye Bye [preauth]
Dec 05 09:15:06 compute-1 sshd-session[212358]: Disconnected from authenticating user root 43.225.158.169 port 39631 [preauth]
Dec 05 09:15:07 compute-1 sudo[212648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kllitebyrmrxlkrqdicokvjiwvmyjisw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926106.8403382-3729-4479734605387/AnsiballZ_podman_container_info.py'
Dec 05 09:15:07 compute-1 sudo[212648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:07 compute-1 python3.9[212650]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 05 09:15:07 compute-1 sudo[212648]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:07 compute-1 sudo[212813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzzcjerxpwtgsrnqvjspyzmvquzrjeni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926107.6084604-3737-104267592196751/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:07 compute-1 sudo[212813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:08 compute-1 python3.9[212815]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:08 compute-1 systemd[1]: Started libpod-conmon-550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96.scope.
Dec 05 09:15:08 compute-1 podman[212816]: 2025-12-05 09:15:08.195665281 +0000 UTC m=+0.082641835 container exec 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:15:08 compute-1 podman[212816]: 2025-12-05 09:15:08.226037276 +0000 UTC m=+0.113013820 container exec_died 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:15:08 compute-1 systemd[1]: libpod-conmon-550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96.scope: Deactivated successfully.
Dec 05 09:15:08 compute-1 sudo[212813]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:08 compute-1 sudo[212998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncvmcijqxpnonsqozgvwovwnkgskndyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926108.4367285-3745-154353543787680/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:08 compute-1 sudo[212998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:15:08.858 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:15:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:15:08.860 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:15:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:15:08.860 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:15:08 compute-1 python3.9[213000]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:09 compute-1 systemd[1]: Started libpod-conmon-550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96.scope.
Dec 05 09:15:09 compute-1 podman[213001]: 2025-12-05 09:15:09.132103137 +0000 UTC m=+0.178763141 container exec 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:15:09 compute-1 podman[213001]: 2025-12-05 09:15:09.168134797 +0000 UTC m=+0.214794711 container exec_died 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:15:09 compute-1 systemd[1]: libpod-conmon-550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96.scope: Deactivated successfully.
Dec 05 09:15:09 compute-1 sudo[212998]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:09 compute-1 sudo[213183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxrpjhegwbizxkcvpkugyywrrfpbgjfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926109.5776894-3753-222977685948537/AnsiballZ_file.py'
Dec 05 09:15:09 compute-1 sudo[213183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:10 compute-1 python3.9[213185]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:10 compute-1 sudo[213183]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:10 compute-1 sudo[213335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtjalxxuzwnuxcdyzodcwtumayfnoepp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926110.3289587-3762-272534367435533/AnsiballZ_podman_container_info.py'
Dec 05 09:15:10 compute-1 sudo[213335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.735 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:15:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:15:10 compute-1 python3.9[213337]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 05 09:15:10 compute-1 sudo[213335]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:11 compute-1 sudo[213510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpiqxjpbnwilpwceaikrssklpcjxgrth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926111.1008806-3770-113590605485070/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:11 compute-1 sudo[213510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:11 compute-1 podman[213474]: 2025-12-05 09:15:11.448588884 +0000 UTC m=+0.074583362 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Dec 05 09:15:11 compute-1 python3.9[213517]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:11 compute-1 systemd[1]: Started libpod-conmon-b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68.scope.
Dec 05 09:15:11 compute-1 podman[213520]: 2025-12-05 09:15:11.801771489 +0000 UTC m=+0.086000734 container exec b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:15:11 compute-1 podman[213520]: 2025-12-05 09:15:11.833959538 +0000 UTC m=+0.118188763 container exec_died b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:15:11 compute-1 systemd[1]: libpod-conmon-b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68.scope: Deactivated successfully.
Dec 05 09:15:11 compute-1 sudo[213510]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:12 compute-1 sudo[213702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynptvduzgsnlnmoeyremfqutvffwukmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926112.0743036-3778-156445969811431/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:12 compute-1 sudo[213702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:12 compute-1 python3.9[213704]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:12 compute-1 systemd[1]: Started libpod-conmon-b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68.scope.
Dec 05 09:15:12 compute-1 podman[213705]: 2025-12-05 09:15:12.842172428 +0000 UTC m=+0.237386540 container exec b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:15:12 compute-1 podman[213724]: 2025-12-05 09:15:12.959731186 +0000 UTC m=+0.101125626 container exec_died b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:15:12 compute-1 podman[213705]: 2025-12-05 09:15:12.967287787 +0000 UTC m=+0.362501889 container exec_died b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:15:12 compute-1 systemd[1]: libpod-conmon-b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68.scope: Deactivated successfully.
Dec 05 09:15:13 compute-1 sudo[213702]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:13 compute-1 sudo[213886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwyialgrkjgoospausakoodcvmqrsygm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926113.2119694-3786-833171226801/AnsiballZ_file.py'
Dec 05 09:15:13 compute-1 sudo[213886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:13 compute-1 python3.9[213888]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:13 compute-1 sudo[213886]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:14 compute-1 sudo[214038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvrxaoblngwyihijthxjxhnskridtfrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926113.9480894-3795-16815079527810/AnsiballZ_podman_container_info.py'
Dec 05 09:15:14 compute-1 sudo[214038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:14 compute-1 python3.9[214040]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 05 09:15:14 compute-1 sudo[214038]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:15 compute-1 sudo[214215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cysekfxpahetiapekhfzwkevenxpcwta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926114.7231665-3803-233928581168207/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:15 compute-1 sudo[214215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:15 compute-1 podman[214177]: 2025-12-05 09:15:15.114928272 +0000 UTC m=+0.113226285 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 05 09:15:15 compute-1 python3.9[214223]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:15 compute-1 systemd[1]: Started libpod-conmon-b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b.scope.
Dec 05 09:15:15 compute-1 podman[214232]: 2025-12-05 09:15:15.656088177 +0000 UTC m=+0.083199768 container exec b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=)
Dec 05 09:15:15 compute-1 podman[214232]: 2025-12-05 09:15:15.6863538 +0000 UTC m=+0.113465361 container exec_died b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 09:15:15 compute-1 systemd[1]: libpod-conmon-b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b.scope: Deactivated successfully.
Dec 05 09:15:15 compute-1 sudo[214215]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:16 compute-1 sudo[214417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbphzoetgsbpmbkygsmfphkkehculdlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926115.9127588-3811-47253288639616/AnsiballZ_podman_container_exec.py'
Dec 05 09:15:16 compute-1 sudo[214417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:16 compute-1 python3.9[214419]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 09:15:16 compute-1 systemd[1]: Started libpod-conmon-b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b.scope.
Dec 05 09:15:16 compute-1 podman[214420]: 2025-12-05 09:15:16.55258571 +0000 UTC m=+0.089027008 container exec b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Dec 05 09:15:16 compute-1 podman[214420]: 2025-12-05 09:15:16.588038796 +0000 UTC m=+0.124480094 container exec_died b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Dec 05 09:15:17 compute-1 sudo[214417]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:17 compute-1 systemd[1]: libpod-conmon-b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b.scope: Deactivated successfully.
Dec 05 09:15:17 compute-1 podman[214466]: 2025-12-05 09:15:17.345393735 +0000 UTC m=+0.065235929 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:15:17 compute-1 sudo[214621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxozwmozcahlnixwpccvvuembarpedly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926117.3884823-3819-260915304900759/AnsiballZ_file.py'
Dec 05 09:15:17 compute-1 sudo[214621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:17 compute-1 python3.9[214623]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:17 compute-1 sudo[214621]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:20 compute-1 podman[214648]: 2025-12-05 09:15:20.634078583 +0000 UTC m=+0.071749876 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 05 09:15:24 compute-1 podman[214669]: 2025-12-05 09:15:24.627487652 +0000 UTC m=+0.062726540 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, 
version=9.6, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 09:15:27 compute-1 podman[214690]: 2025-12-05 09:15:27.615209601 +0000 UTC m=+0.053910910 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:15:33 compute-1 podman[214714]: 2025-12-05 09:15:33.636553405 +0000 UTC m=+0.078076349 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:15:41 compute-1 podman[214738]: 2025-12-05 09:15:41.652130083 +0000 UTC m=+0.081687858 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 09:15:42 compute-1 sudo[214885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqwlduiatuoarlouqlrhabwlbfqysogk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926142.44682-4063-163137131919900/AnsiballZ_file.py'
Dec 05 09:15:42 compute-1 sudo[214885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:42 compute-1 python3.9[214887]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:42 compute-1 sudo[214885]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:43 compute-1 sudo[215037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agpumjsjoebanabmeykqqogboqpzntfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926143.1980517-4086-2784072469943/AnsiballZ_stat.py'
Dec 05 09:15:43 compute-1 sudo[215037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:43 compute-1 python3.9[215039]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:43 compute-1 sudo[215037]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:44 compute-1 sudo[215160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdkgnilomhjbxfbgetvkvlcevjahiskt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926143.1980517-4086-2784072469943/AnsiballZ_copy.py'
Dec 05 09:15:44 compute-1 sudo[215160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:44 compute-1 python3.9[215162]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764926143.1980517-4086-2784072469943/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:44 compute-1 sudo[215160]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:45 compute-1 sudo[215326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdpeampzqiufebwhyiduihiplhhxqtsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926144.9013906-4134-53368010976494/AnsiballZ_file.py'
Dec 05 09:15:45 compute-1 sudo[215326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:45 compute-1 podman[215286]: 2025-12-05 09:15:45.266499761 +0000 UTC m=+0.096434941 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 09:15:45 compute-1 python3.9[215333]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:45 compute-1 sudo[215326]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:45 compute-1 sudo[215490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czidgjxwdlrvizpohrdagndamzrfmrjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926145.6531303-4158-100572136398577/AnsiballZ_stat.py'
Dec 05 09:15:45 compute-1 sudo[215490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:46 compute-1 python3.9[215492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:46 compute-1 sudo[215490]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:46 compute-1 sudo[215568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quqnqiyfbaydawrrmbgijdyuxdawybtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926145.6531303-4158-100572136398577/AnsiballZ_file.py'
Dec 05 09:15:46 compute-1 sudo[215568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:46 compute-1 python3.9[215570]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:46 compute-1 sudo[215568]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:47 compute-1 sudo[215720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cooxblbzasdqvmpfnbnzpsdjlevejrlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926146.8443134-4194-36994312115308/AnsiballZ_stat.py'
Dec 05 09:15:47 compute-1 sudo[215720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:47 compute-1 python3.9[215722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:47 compute-1 sudo[215720]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:47 compute-1 podman[215723]: 2025-12-05 09:15:47.457104329 +0000 UTC m=+0.054326782 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:15:47 compute-1 sudo[215818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyvazicxpkrxndbjmzrixyngrjkucpzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926146.8443134-4194-36994312115308/AnsiballZ_file.py'
Dec 05 09:15:47 compute-1 sudo[215818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:47 compute-1 python3.9[215820]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.79f81kdz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:47 compute-1 sudo[215818]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:48 compute-1 sudo[215970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdmrmciuaoqlkbqvjkecmikpdnyaipih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926148.1179886-4230-274502299517111/AnsiballZ_stat.py'
Dec 05 09:15:48 compute-1 sudo[215970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:48 compute-1 python3.9[215972]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:48 compute-1 sudo[215970]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:48 compute-1 sudo[216048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yizybkwnvuhknyksjqzkbjclmvuolzul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926148.1179886-4230-274502299517111/AnsiballZ_file.py'
Dec 05 09:15:48 compute-1 sudo[216048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:49 compute-1 python3.9[216050]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:49 compute-1 sudo[216048]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:49 compute-1 sudo[216200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opitkgxiprpozmbzevohepqgeoglpdzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926149.406733-4269-223642793316974/AnsiballZ_command.py'
Dec 05 09:15:49 compute-1 sudo[216200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:49 compute-1 python3.9[216202]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:15:49 compute-1 sudo[216200]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:50 compute-1 sudo[216366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaeytetzpfbblggiihzyrwphevyltdxi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764926150.1813664-4293-42788492613431/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 09:15:50 compute-1 sudo[216366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:50 compute-1 podman[216327]: 2025-12-05 09:15:50.752807288 +0000 UTC m=+0.062123826 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:15:50 compute-1 python3[216374]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 09:15:50 compute-1 sudo[216366]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:51 compute-1 sudo[216525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iogqugonimpdgfxejbcblujqkhcnzann ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926151.1793447-4317-242496928055655/AnsiballZ_stat.py'
Dec 05 09:15:51 compute-1 sudo[216525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:51 compute-1 python3.9[216527]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:51 compute-1 sudo[216525]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:51 compute-1 sudo[216603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjsxfvcmqrbwbqlbwbtxpglepyqzyxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926151.1793447-4317-242496928055655/AnsiballZ_file.py'
Dec 05 09:15:51 compute-1 sudo[216603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:52 compute-1 python3.9[216605]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:52 compute-1 sudo[216603]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:52 compute-1 sudo[216755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfqcbfzjjzvqujfphxeuuzxtswnumjdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926152.4647138-4353-150972974968731/AnsiballZ_stat.py'
Dec 05 09:15:52 compute-1 sudo[216755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:52 compute-1 python3.9[216757]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:53 compute-1 sudo[216755]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:53 compute-1 sudo[216833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnznvomocpsmmfvdoqahifhiohggbecx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926152.4647138-4353-150972974968731/AnsiballZ_file.py'
Dec 05 09:15:53 compute-1 sudo[216833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:53 compute-1 python3.9[216835]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:53 compute-1 sudo[216833]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:54 compute-1 sudo[216985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrwbocqfakbpiepycdzfovmcecomqivz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926153.710294-4389-109138000768993/AnsiballZ_stat.py'
Dec 05 09:15:54 compute-1 sudo[216985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:54 compute-1 python3.9[216987]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:54 compute-1 sudo[216985]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:54 compute-1 sudo[217063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhuvxbqbirxljwcbpmksrrrxspurvotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926153.710294-4389-109138000768993/AnsiballZ_file.py'
Dec 05 09:15:54 compute-1 sudo[217063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:54 compute-1 python3.9[217065]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:54 compute-1 sudo[217063]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:55 compute-1 sudo[217226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvbsmasxbnkxinhomvqiixfbxmxogjfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926154.9565713-4425-214751877056004/AnsiballZ_stat.py'
Dec 05 09:15:55 compute-1 sudo[217226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:55 compute-1 podman[217189]: 2025-12-05 09:15:55.311583589 +0000 UTC m=+0.065454438 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 05 09:15:55 compute-1 python3.9[217234]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:55 compute-1 sudo[217226]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:55 compute-1 sudo[217314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvmvrbjrodpgclkvzrvlgxlyztwvcajg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926154.9565713-4425-214751877056004/AnsiballZ_file.py'
Dec 05 09:15:55 compute-1 sudo[217314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:55 compute-1 python3.9[217316]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:56 compute-1 sudo[217314]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:56 compute-1 sudo[217466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kexitnvstzrrzdrffqcjkjbhgwajoztd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926156.1934428-4461-78308845313317/AnsiballZ_stat.py'
Dec 05 09:15:56 compute-1 sudo[217466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:56 compute-1 python3.9[217468]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 09:15:56 compute-1 sudo[217466]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:57 compute-1 sudo[217591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrnptvwwsnycrhcxviwgbknzgjknzcto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926156.1934428-4461-78308845313317/AnsiballZ_copy.py'
Dec 05 09:15:57 compute-1 sudo[217591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:57 compute-1 python3.9[217593]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764926156.1934428-4461-78308845313317/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:57 compute-1 sudo[217591]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:57 compute-1 sudo[217754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkboiucahujxhbpxvwxhultxrhiewsir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926157.6918144-4506-276978956679822/AnsiballZ_file.py'
Dec 05 09:15:57 compute-1 sudo[217754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:58 compute-1 podman[217717]: 2025-12-05 09:15:58.012755375 +0000 UTC m=+0.064587385 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:15:58 compute-1 nova_compute[189066]: 2025-12-05 09:15:58.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:15:58 compute-1 nova_compute[189066]: 2025-12-05 09:15:58.045 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:15:58 compute-1 python3.9[217760]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:15:58 compute-1 sudo[217754]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:58 compute-1 sudo[217917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byokdwouyvcfkrhdatfstbpyfleomlii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926158.4318876-4530-37004833316311/AnsiballZ_command.py'
Dec 05 09:15:58 compute-1 sudo[217917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:15:58 compute-1 python3.9[217919]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:15:58 compute-1 sudo[217917]: pam_unix(sudo:session): session closed for user root
Dec 05 09:15:59 compute-1 nova_compute[189066]: 2025-12-05 09:15:59.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:15:59 compute-1 nova_compute[189066]: 2025-12-05 09:15:59.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:15:59 compute-1 nova_compute[189066]: 2025-12-05 09:15:59.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:15:59 compute-1 nova_compute[189066]: 2025-12-05 09:15:59.037 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:15:59 compute-1 sudo[218072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhjxpyhfktcamnpahbgkkulglajqdosz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926159.2787018-4554-233529522111402/AnsiballZ_blockinfile.py'
Dec 05 09:15:59 compute-1 sudo[218072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:00 compute-1 python3.9[218074]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.064 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.065 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.065 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.065 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:16:00 compute-1 sudo[218072]: pam_unix(sudo:session): session closed for user root
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.245 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.246 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5954MB free_disk=73.3713150024414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.247 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.247 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.332 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.333 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.359 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.381 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.383 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:16:00 compute-1 nova_compute[189066]: 2025-12-05 09:16:00.384 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:16:00 compute-1 sudo[218224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwuvcxcdifmuwzjqsdonlzavywgomlvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926160.4984357-4581-86702065649893/AnsiballZ_command.py'
Dec 05 09:16:00 compute-1 sudo[218224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:16:00 compute-1 python3.9[218226]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:16:01 compute-1 sudo[218224]: pam_unix(sudo:session): session closed for user root
Dec 05 09:16:01 compute-1 nova_compute[189066]: 2025-12-05 09:16:01.384 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:01 compute-1 nova_compute[189066]: 2025-12-05 09:16:01.384 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:01 compute-1 sudo[218379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvlaksvptyvavchovgxjwcobtjyvcsxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926161.2592835-4606-196710207624917/AnsiballZ_stat.py'
Dec 05 09:16:01 compute-1 sudo[218379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:16:01 compute-1 python3.9[218381]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 09:16:01 compute-1 sudo[218379]: pam_unix(sudo:session): session closed for user root
Dec 05 09:16:02 compute-1 sudo[218533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lokuueejznsomscztugjesvpxdybyzun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926161.9913266-4629-148894521493565/AnsiballZ_command.py'
Dec 05 09:16:02 compute-1 sudo[218533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:16:02 compute-1 python3.9[218535]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 09:16:02 compute-1 sudo[218533]: pam_unix(sudo:session): session closed for user root
Dec 05 09:16:02 compute-1 sudo[218688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enuyanjdphrcupshsinklrppdcwmbyzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764926162.719732-4653-175088498311932/AnsiballZ_file.py'
Dec 05 09:16:02 compute-1 sudo[218688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:16:03 compute-1 python3.9[218690]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 09:16:03 compute-1 sudo[218688]: pam_unix(sudo:session): session closed for user root
Dec 05 09:16:03 compute-1 sshd-session[189392]: Connection closed by 192.168.122.30 port 44802
Dec 05 09:16:03 compute-1 sshd-session[189389]: pam_unix(sshd:session): session closed for user zuul
Dec 05 09:16:03 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Dec 05 09:16:03 compute-1 systemd[1]: session-27.scope: Consumed 2min 1.610s CPU time.
Dec 05 09:16:03 compute-1 systemd-logind[807]: Session 27 logged out. Waiting for processes to exit.
Dec 05 09:16:03 compute-1 systemd-logind[807]: Removed session 27.
Dec 05 09:16:03 compute-1 podman[218715]: 2025-12-05 09:16:03.910991365 +0000 UTC m=+0.061277164 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:16:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:16:08.859 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:16:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:16:08.861 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:16:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:16:08.861 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:16:11 compute-1 sshd-session[218739]: Received disconnect from 122.168.194.41 port 45544:11: Bye Bye [preauth]
Dec 05 09:16:11 compute-1 sshd-session[218739]: Disconnected from authenticating user root 122.168.194.41 port 45544 [preauth]
Dec 05 09:16:12 compute-1 podman[218743]: 2025-12-05 09:16:12.737328063 +0000 UTC m=+0.155450400 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute)
Dec 05 09:16:13 compute-1 sshd-session[218741]: Received disconnect from 185.118.15.236 port 35292:11: Bye Bye [preauth]
Dec 05 09:16:13 compute-1 sshd-session[218741]: Disconnected from authenticating user root 185.118.15.236 port 35292 [preauth]
Dec 05 09:16:15 compute-1 podman[218763]: 2025-12-05 09:16:15.697708381 +0000 UTC m=+0.122393674 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 09:16:17 compute-1 podman[218792]: 2025-12-05 09:16:17.620444033 +0000 UTC m=+0.056414574 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 05 09:16:18 compute-1 sshd-session[218790]: Received disconnect from 43.225.158.169 port 52774:11: Bye Bye [preauth]
Dec 05 09:16:18 compute-1 sshd-session[218790]: Disconnected from authenticating user root 43.225.158.169 port 52774 [preauth]
Dec 05 09:16:21 compute-1 podman[218811]: 2025-12-05 09:16:21.641952716 +0000 UTC m=+0.081502844 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 05 09:16:22 compute-1 sshd-session[218227]: ssh_dispatch_run_fatal: Connection from 101.47.162.91 port 40590: Connection timed out [preauth]
Dec 05 09:16:25 compute-1 podman[218831]: 2025-12-05 09:16:25.630774511 +0000 UTC m=+0.068272167 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, vcs-type=git, version=9.6, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter)
Dec 05 09:16:28 compute-1 podman[218853]: 2025-12-05 09:16:28.62678469 +0000 UTC m=+0.071558319 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:16:34 compute-1 podman[218879]: 2025-12-05 09:16:34.627643134 +0000 UTC m=+0.062786962 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:16:43 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:16:43.462 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:16:43 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:16:43.467 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:16:43 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:16:43.469 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:16:43 compute-1 podman[218903]: 2025-12-05 09:16:43.654848223 +0000 UTC m=+0.087793098 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, tcib_managed=true)
Dec 05 09:16:45 compute-1 sshd-session[218916]: Received disconnect from 122.114.113.177 port 33970:11: Bye Bye [preauth]
Dec 05 09:16:45 compute-1 sshd-session[218916]: Disconnected from authenticating user root 122.114.113.177 port 33970 [preauth]
Dec 05 09:16:46 compute-1 podman[218925]: 2025-12-05 09:16:46.67926445 +0000 UTC m=+0.114414246 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 09:16:48 compute-1 podman[218952]: 2025-12-05 09:16:48.607881855 +0000 UTC m=+0.051733182 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 09:16:52 compute-1 podman[218972]: 2025-12-05 09:16:52.639705372 +0000 UTC m=+0.072429848 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:16:56 compute-1 podman[218992]: 2025-12-05 09:16:56.638254462 +0000 UTC m=+0.071805454 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7)
Dec 05 09:16:57 compute-1 nova_compute[189066]: 2025-12-05 09:16:57.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:57 compute-1 nova_compute[189066]: 2025-12-05 09:16:57.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 09:16:57 compute-1 nova_compute[189066]: 2025-12-05 09:16:57.102 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 09:16:57 compute-1 nova_compute[189066]: 2025-12-05 09:16:57.105 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:57 compute-1 nova_compute[189066]: 2025-12-05 09:16:57.105 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 09:16:57 compute-1 nova_compute[189066]: 2025-12-05 09:16:57.153 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:58 compute-1 nova_compute[189066]: 2025-12-05 09:16:58.256 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:16:59 compute-1 podman[219013]: 2025-12-05 09:16:59.634554703 +0000 UTC m=+0.066690551 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.187 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.189 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.225 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.226 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.226 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.226 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.402 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.403 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6025MB free_disk=73.37164306640625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.404 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.404 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.731 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.732 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.820 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.885 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.887 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:17:00 compute-1 nova_compute[189066]: 2025-12-05 09:17:00.888 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:17:01 compute-1 nova_compute[189066]: 2025-12-05 09:17:01.720 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:01 compute-1 nova_compute[189066]: 2025-12-05 09:17:01.721 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:17:02 compute-1 nova_compute[189066]: 2025-12-05 09:17:02.016 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:02 compute-1 nova_compute[189066]: 2025-12-05 09:17:02.019 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:02 compute-1 nova_compute[189066]: 2025-12-05 09:17:02.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:02 compute-1 nova_compute[189066]: 2025-12-05 09:17:02.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:03 compute-1 nova_compute[189066]: 2025-12-05 09:17:03.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:05 compute-1 podman[219037]: 2025-12-05 09:17:05.624787381 +0000 UTC m=+0.063830792 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:17:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:17:08.861 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:17:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:17:08.862 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:17:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:17:08.862 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:17:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:17:14 compute-1 podman[219063]: 2025-12-05 09:17:14.623925233 +0000 UTC m=+0.065363910 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:17:17 compute-1 podman[219083]: 2025-12-05 09:17:17.715491731 +0000 UTC m=+0.147506321 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 05 09:17:19 compute-1 podman[219108]: 2025-12-05 09:17:19.619518895 +0000 UTC m=+0.057655276 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:17:23 compute-1 podman[219129]: 2025-12-05 09:17:23.65120874 +0000 UTC m=+0.088089255 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 09:17:27 compute-1 podman[219152]: 2025-12-05 09:17:27.626793189 +0000 UTC m=+0.061333862 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350)
Dec 05 09:17:28 compute-1 sshd-session[219150]: Received disconnect from 122.168.194.41 port 54114:11: Bye Bye [preauth]
Dec 05 09:17:28 compute-1 sshd-session[219150]: Disconnected from authenticating user root 122.168.194.41 port 54114 [preauth]
Dec 05 09:17:30 compute-1 podman[219174]: 2025-12-05 09:17:30.612307882 +0000 UTC m=+0.051402415 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:17:32 compute-1 sshd-session[219198]: Received disconnect from 43.225.158.169 port 37682:11: Bye Bye [preauth]
Dec 05 09:17:32 compute-1 sshd-session[219198]: Disconnected from authenticating user root 43.225.158.169 port 37682 [preauth]
Dec 05 09:17:36 compute-1 podman[219202]: 2025-12-05 09:17:36.608656206 +0000 UTC m=+0.044899489 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:17:37 compute-1 sshd-session[219200]: Received disconnect from 185.118.15.236 port 35414:11: Bye Bye [preauth]
Dec 05 09:17:37 compute-1 sshd-session[219200]: Disconnected from authenticating user root 185.118.15.236 port 35414 [preauth]
Dec 05 09:17:45 compute-1 podman[219226]: 2025-12-05 09:17:45.624786979 +0000 UTC m=+0.064717084 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 05 09:17:48 compute-1 podman[219247]: 2025-12-05 09:17:48.659694837 +0000 UTC m=+0.095927463 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:17:50 compute-1 podman[219275]: 2025-12-05 09:17:50.620042382 +0000 UTC m=+0.054336188 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:17:51 compute-1 sshd-session[219273]: Connection reset by authenticating user root 45.135.232.92 port 29878 [preauth]
Dec 05 09:17:54 compute-1 sshd-session[219296]: Invalid user user from 45.135.232.92 port 29898
Dec 05 09:17:54 compute-1 podman[219298]: 2025-12-05 09:17:54.273768152 +0000 UTC m=+0.061161127 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 09:17:54 compute-1 sshd-session[219296]: Connection reset by invalid user user 45.135.232.92 port 29898 [preauth]
Dec 05 09:17:57 compute-1 sshd-session[219318]: Connection reset by authenticating user root 45.135.232.92 port 40490 [preauth]
Dec 05 09:17:58 compute-1 nova_compute[189066]: 2025-12-05 09:17:58.016 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:17:58 compute-1 podman[219322]: 2025-12-05 09:17:58.658971189 +0000 UTC m=+0.093499103 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 09:17:59 compute-1 sshd-session[219320]: Connection reset by authenticating user root 45.135.232.92 port 40536 [preauth]
Dec 05 09:18:01 compute-1 sshd-session[219343]: Invalid user Admin from 45.135.232.92 port 40542
Dec 05 09:18:01 compute-1 podman[219345]: 2025-12-05 09:18:01.089372106 +0000 UTC m=+0.056090571 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:18:01 compute-1 sshd-session[219343]: Connection reset by invalid user Admin 45.135.232.92 port 40542 [preauth]
Dec 05 09:18:05 compute-1 nova_compute[189066]: 2025-12-05 09:18:05.297 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:05 compute-1 nova_compute[189066]: 2025-12-05 09:18:05.298 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:18:05 compute-1 nova_compute[189066]: 2025-12-05 09:18:05.299 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.191 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.192 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.192 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.192 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.192 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.193 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.193 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.193 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:18:06 compute-1 nova_compute[189066]: 2025-12-05 09:18:06.193 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:07 compute-1 podman[219369]: 2025-12-05 09:18:07.625407889 +0000 UTC m=+0.066969020 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:18:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:18:08.862 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:18:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:18:08.863 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:18:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:18:08.863 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:18:09 compute-1 nova_compute[189066]: 2025-12-05 09:18:09.369 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:18:09 compute-1 nova_compute[189066]: 2025-12-05 09:18:09.370 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:18:09 compute-1 nova_compute[189066]: 2025-12-05 09:18:09.370 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:18:09 compute-1 nova_compute[189066]: 2025-12-05 09:18:09.370 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:18:09 compute-1 nova_compute[189066]: 2025-12-05 09:18:09.522 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:18:09 compute-1 nova_compute[189066]: 2025-12-05 09:18:09.523 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6055MB free_disk=73.37162399291992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:18:09 compute-1 nova_compute[189066]: 2025-12-05 09:18:09.523 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:18:09 compute-1 nova_compute[189066]: 2025-12-05 09:18:09.523 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:18:11 compute-1 nova_compute[189066]: 2025-12-05 09:18:11.096 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:18:11 compute-1 nova_compute[189066]: 2025-12-05 09:18:11.096 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:18:11 compute-1 nova_compute[189066]: 2025-12-05 09:18:11.253 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing inventories for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 09:18:11 compute-1 nova_compute[189066]: 2025-12-05 09:18:11.289 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating ProviderTree inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 09:18:11 compute-1 nova_compute[189066]: 2025-12-05 09:18:11.289 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:18:11 compute-1 nova_compute[189066]: 2025-12-05 09:18:11.336 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing aggregate associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 09:18:11 compute-1 nova_compute[189066]: 2025-12-05 09:18:11.399 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing trait associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 09:18:11 compute-1 nova_compute[189066]: 2025-12-05 09:18:11.429 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:18:12 compute-1 nova_compute[189066]: 2025-12-05 09:18:12.357 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:18:12 compute-1 nova_compute[189066]: 2025-12-05 09:18:12.358 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:18:12 compute-1 nova_compute[189066]: 2025-12-05 09:18:12.359 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:18:16 compute-1 podman[219393]: 2025-12-05 09:18:16.642725329 +0000 UTC m=+0.085189417 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:18:17 compute-1 nova_compute[189066]: 2025-12-05 09:18:17.363 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:18:19 compute-1 podman[219413]: 2025-12-05 09:18:19.662856384 +0000 UTC m=+0.098031764 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:18:21 compute-1 podman[219440]: 2025-12-05 09:18:21.612289892 +0000 UTC m=+0.054267287 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 05 09:18:24 compute-1 podman[219459]: 2025-12-05 09:18:24.617998619 +0000 UTC m=+0.054571524 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:18:29 compute-1 podman[219479]: 2025-12-05 09:18:29.622712536 +0000 UTC m=+0.060607313 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Dec 05 09:18:31 compute-1 podman[219501]: 2025-12-05 09:18:31.80970118 +0000 UTC m=+0.055385874 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:18:35 compute-1 sshd-session[219457]: Received disconnect from 101.47.162.91 port 44980:11: Bye Bye [preauth]
Dec 05 09:18:35 compute-1 sshd-session[219457]: Disconnected from 101.47.162.91 port 44980 [preauth]
Dec 05 09:18:38 compute-1 podman[219528]: 2025-12-05 09:18:38.64058824 +0000 UTC m=+0.084667166 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:18:39 compute-1 sshd-session[219526]: Received disconnect from 43.225.158.169 port 50822:11: Bye Bye [preauth]
Dec 05 09:18:39 compute-1 sshd-session[219526]: Disconnected from authenticating user root 43.225.158.169 port 50822 [preauth]
Dec 05 09:18:42 compute-1 sshd-session[219554]: Received disconnect from 122.168.194.41 port 58598:11: Bye Bye [preauth]
Dec 05 09:18:42 compute-1 sshd-session[219554]: Disconnected from authenticating user root 122.168.194.41 port 58598 [preauth]
Dec 05 09:18:47 compute-1 podman[219556]: 2025-12-05 09:18:47.622997402 +0000 UTC m=+0.062708782 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Dec 05 09:18:50 compute-1 podman[219576]: 2025-12-05 09:18:50.685080341 +0000 UTC m=+0.118792611 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:18:52 compute-1 podman[219604]: 2025-12-05 09:18:52.631174804 +0000 UTC m=+0.062376435 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 09:18:55 compute-1 podman[219623]: 2025-12-05 09:18:55.680731806 +0000 UTC m=+0.068475533 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 09:18:58 compute-1 sshd-session[219644]: Received disconnect from 185.118.15.236 port 35534:11: Bye Bye [preauth]
Dec 05 09:18:58 compute-1 sshd-session[219644]: Disconnected from authenticating user root 185.118.15.236 port 35534 [preauth]
Dec 05 09:19:00 compute-1 nova_compute[189066]: 2025-12-05 09:19:00.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:00 compute-1 podman[219646]: 2025-12-05 09:19:00.617927795 +0000 UTC m=+0.060342014 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 09:19:02 compute-1 nova_compute[189066]: 2025-12-05 09:19:02.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:02 compute-1 nova_compute[189066]: 2025-12-05 09:19:02.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:19:02 compute-1 nova_compute[189066]: 2025-12-05 09:19:02.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:19:02 compute-1 nova_compute[189066]: 2025-12-05 09:19:02.326 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:19:02 compute-1 podman[219668]: 2025-12-05 09:19:02.616959139 +0000 UTC m=+0.052840972 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.092 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.093 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.093 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.093 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.263 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.264 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6074MB free_disk=73.37162399291992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.265 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.265 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.453 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.453 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.481 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.546 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.548 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:19:03 compute-1 nova_compute[189066]: 2025-12-05 09:19:03.548 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:19:04 compute-1 nova_compute[189066]: 2025-12-05 09:19:04.542 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:04 compute-1 nova_compute[189066]: 2025-12-05 09:19:04.543 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:04 compute-1 nova_compute[189066]: 2025-12-05 09:19:04.543 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:05 compute-1 nova_compute[189066]: 2025-12-05 09:19:05.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:05 compute-1 nova_compute[189066]: 2025-12-05 09:19:05.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:19:05 compute-1 nova_compute[189066]: 2025-12-05 09:19:05.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:19:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:19:08.863 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:19:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:19:08.864 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:19:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:19:08.865 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:19:09 compute-1 podman[219692]: 2025-12-05 09:19:09.616079887 +0000 UTC m=+0.055858895 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.737 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.738 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:19:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:19:18 compute-1 podman[219716]: 2025-12-05 09:19:18.629047422 +0000 UTC m=+0.064687820 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:19:21 compute-1 podman[219736]: 2025-12-05 09:19:21.67927912 +0000 UTC m=+0.116609567 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:19:23 compute-1 podman[219763]: 2025-12-05 09:19:23.617771358 +0000 UTC m=+0.058410748 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 09:19:26 compute-1 podman[219782]: 2025-12-05 09:19:26.644927713 +0000 UTC m=+0.062322603 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd)
Dec 05 09:19:31 compute-1 podman[219802]: 2025-12-05 09:19:31.656118448 +0000 UTC m=+0.095449481 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 05 09:19:33 compute-1 podman[219825]: 2025-12-05 09:19:33.617700218 +0000 UTC m=+0.055222820 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:19:40 compute-1 podman[219849]: 2025-12-05 09:19:40.61127531 +0000 UTC m=+0.048261730 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:19:49 compute-1 sshd-session[219873]: Received disconnect from 43.225.158.169 port 35735:11: Bye Bye [preauth]
Dec 05 09:19:49 compute-1 sshd-session[219873]: Disconnected from authenticating user root 43.225.158.169 port 35735 [preauth]
Dec 05 09:19:49 compute-1 podman[219875]: 2025-12-05 09:19:49.631598542 +0000 UTC m=+0.063483541 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 09:19:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:19:50.423 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:19:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:19:50.424 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:19:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:19:52.427 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:19:52 compute-1 podman[219895]: 2025-12-05 09:19:52.680158591 +0000 UTC m=+0.119775386 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Dec 05 09:19:54 compute-1 podman[219922]: 2025-12-05 09:19:54.613629575 +0000 UTC m=+0.052847081 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 05 09:19:57 compute-1 podman[219941]: 2025-12-05 09:19:57.633714598 +0000 UTC m=+0.072256545 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:19:58 compute-1 nova_compute[189066]: 2025-12-05 09:19:58.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:01 compute-1 sshd-session[219961]: Received disconnect from 122.168.194.41 port 51680:11: Bye Bye [preauth]
Dec 05 09:20:01 compute-1 sshd-session[219961]: Disconnected from authenticating user root 122.168.194.41 port 51680 [preauth]
Dec 05 09:20:02 compute-1 nova_compute[189066]: 2025-12-05 09:20:02.304 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:02 compute-1 podman[219963]: 2025-12-05 09:20:02.625225152 +0000 UTC m=+0.064157458 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.183 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.184 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.184 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.184 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.362 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.363 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6077MB free_disk=73.37201309204102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.364 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:20:03 compute-1 nova_compute[189066]: 2025-12-05 09:20:03.364 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:20:04 compute-1 podman[219984]: 2025-12-05 09:20:04.619110891 +0000 UTC m=+0.062265152 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:20:04 compute-1 nova_compute[189066]: 2025-12-05 09:20:04.709 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:20:04 compute-1 nova_compute[189066]: 2025-12-05 09:20:04.709 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:20:04 compute-1 nova_compute[189066]: 2025-12-05 09:20:04.741 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:20:04 compute-1 nova_compute[189066]: 2025-12-05 09:20:04.992 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:20:04 compute-1 nova_compute[189066]: 2025-12-05 09:20:04.994 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:20:04 compute-1 nova_compute[189066]: 2025-12-05 09:20:04.994 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:20:05 compute-1 nova_compute[189066]: 2025-12-05 09:20:05.989 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:05 compute-1 nova_compute[189066]: 2025-12-05 09:20:05.989 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:05 compute-1 nova_compute[189066]: 2025-12-05 09:20:05.990 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:20:05 compute-1 nova_compute[189066]: 2025-12-05 09:20:05.990 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:20:06 compute-1 nova_compute[189066]: 2025-12-05 09:20:06.030 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:20:06 compute-1 nova_compute[189066]: 2025-12-05 09:20:06.030 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:06 compute-1 nova_compute[189066]: 2025-12-05 09:20:06.031 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:06 compute-1 nova_compute[189066]: 2025-12-05 09:20:06.031 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:07 compute-1 nova_compute[189066]: 2025-12-05 09:20:07.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:20:07 compute-1 nova_compute[189066]: 2025-12-05 09:20:07.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:20:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:20:08.864 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:20:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:20:08.865 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:20:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:20:08.865 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:20:11 compute-1 podman[220009]: 2025-12-05 09:20:11.650915256 +0000 UTC m=+0.082488405 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:20:18 compute-1 sshd-session[220033]: Received disconnect from 185.118.15.236 port 35664:11: Bye Bye [preauth]
Dec 05 09:20:18 compute-1 sshd-session[220033]: Disconnected from authenticating user root 185.118.15.236 port 35664 [preauth]
Dec 05 09:20:20 compute-1 podman[220035]: 2025-12-05 09:20:20.629224632 +0000 UTC m=+0.066614807 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 05 09:20:23 compute-1 podman[220055]: 2025-12-05 09:20:23.648597497 +0000 UTC m=+0.087650351 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 09:20:25 compute-1 podman[220081]: 2025-12-05 09:20:25.613704464 +0000 UTC m=+0.052585675 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 05 09:20:28 compute-1 podman[220099]: 2025-12-05 09:20:28.629149223 +0000 UTC m=+0.071714192 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 09:20:33 compute-1 podman[220119]: 2025-12-05 09:20:33.626276994 +0000 UTC m=+0.063955412 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Dec 05 09:20:35 compute-1 podman[220140]: 2025-12-05 09:20:35.640488301 +0000 UTC m=+0.081028790 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:20:42 compute-1 podman[220166]: 2025-12-05 09:20:42.616893193 +0000 UTC m=+0.060616001 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:20:51 compute-1 podman[220192]: 2025-12-05 09:20:51.626992496 +0000 UTC m=+0.064691480 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Dec 05 09:20:54 compute-1 podman[220212]: 2025-12-05 09:20:54.663628033 +0000 UTC m=+0.104793780 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 09:20:55 compute-1 sshd-session[220190]: Received disconnect from 101.47.162.91 port 42764:11: Bye Bye [preauth]
Dec 05 09:20:55 compute-1 sshd-session[220190]: Disconnected from authenticating user root 101.47.162.91 port 42764 [preauth]
Dec 05 09:20:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:20:56.010 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:20:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:20:56.012 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:20:56 compute-1 podman[220238]: 2025-12-05 09:20:56.647836896 +0000 UTC m=+0.087912762 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 09:20:59 compute-1 podman[220257]: 2025-12-05 09:20:59.62113171 +0000 UTC m=+0.061488990 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec 05 09:21:00 compute-1 nova_compute[189066]: 2025-12-05 09:21:00.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:03 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:03.015 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:21:03 compute-1 nova_compute[189066]: 2025-12-05 09:21:03.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:03 compute-1 sshd-session[220277]: Received disconnect from 43.225.158.169 port 48875:11: Bye Bye [preauth]
Dec 05 09:21:03 compute-1 sshd-session[220277]: Disconnected from authenticating user root 43.225.158.169 port 48875 [preauth]
Dec 05 09:21:04 compute-1 nova_compute[189066]: 2025-12-05 09:21:04.014 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:04 compute-1 podman[220279]: 2025-12-05 09:21:04.619449366 +0000 UTC m=+0.055203805 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.065 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.065 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.066 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.066 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.245 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.247 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6078MB free_disk=73.37199401855469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.247 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.248 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.337 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.338 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.376 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.398 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.400 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:21:05 compute-1 nova_compute[189066]: 2025-12-05 09:21:05.401 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:21:06 compute-1 nova_compute[189066]: 2025-12-05 09:21:06.401 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:06 compute-1 nova_compute[189066]: 2025-12-05 09:21:06.403 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:21:06 compute-1 nova_compute[189066]: 2025-12-05 09:21:06.403 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:21:06 compute-1 podman[220300]: 2025-12-05 09:21:06.611144895 +0000 UTC m=+0.051693756 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:21:06 compute-1 nova_compute[189066]: 2025-12-05 09:21:06.618 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:21:06 compute-1 nova_compute[189066]: 2025-12-05 09:21:06.619 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:06 compute-1 nova_compute[189066]: 2025-12-05 09:21:06.619 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:07 compute-1 nova_compute[189066]: 2025-12-05 09:21:07.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:07 compute-1 nova_compute[189066]: 2025-12-05 09:21:07.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:07 compute-1 nova_compute[189066]: 2025-12-05 09:21:07.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:21:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:08.865 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:21:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:08.866 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:21:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:08.866 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.739 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.740 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.741 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:21:10.742 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:21:13 compute-1 podman[220324]: 2025-12-05 09:21:13.612638435 +0000 UTC m=+0.058651810 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:21:20 compute-1 sshd-session[220348]: Received disconnect from 122.168.194.41 port 47716:11: Bye Bye [preauth]
Dec 05 09:21:20 compute-1 sshd-session[220348]: Disconnected from authenticating user root 122.168.194.41 port 47716 [preauth]
Dec 05 09:21:22 compute-1 podman[220350]: 2025-12-05 09:21:22.619546583 +0000 UTC m=+0.063530470 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 05 09:21:25 compute-1 podman[220371]: 2025-12-05 09:21:25.650787997 +0000 UTC m=+0.089407809 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:21:27 compute-1 podman[220396]: 2025-12-05 09:21:27.612889966 +0000 UTC m=+0.058604049 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 05 09:21:30 compute-1 podman[220415]: 2025-12-05 09:21:30.624469624 +0000 UTC m=+0.062367751 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:21:35 compute-1 podman[220435]: 2025-12-05 09:21:35.641939782 +0000 UTC m=+0.073753802 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 09:21:37 compute-1 podman[220456]: 2025-12-05 09:21:37.642977734 +0000 UTC m=+0.081385822 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:21:44 compute-1 podman[220483]: 2025-12-05 09:21:44.676795231 +0000 UTC m=+0.116605741 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:21:45 compute-1 sshd-session[220481]: Received disconnect from 185.118.15.236 port 35786:11: Bye Bye [preauth]
Dec 05 09:21:45 compute-1 sshd-session[220481]: Disconnected from authenticating user root 185.118.15.236 port 35786 [preauth]
Dec 05 09:21:53 compute-1 podman[220506]: 2025-12-05 09:21:53.636274687 +0000 UTC m=+0.074035139 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 09:21:55 compute-1 nova_compute[189066]: 2025-12-05 09:21:55.946 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:21:55 compute-1 nova_compute[189066]: 2025-12-05 09:21:55.946 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:21:56 compute-1 nova_compute[189066]: 2025-12-05 09:21:56.039 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:21:56 compute-1 nova_compute[189066]: 2025-12-05 09:21:56.204 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:21:56 compute-1 nova_compute[189066]: 2025-12-05 09:21:56.205 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:21:56 compute-1 nova_compute[189066]: 2025-12-05 09:21:56.216 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:21:56 compute-1 nova_compute[189066]: 2025-12-05 09:21:56.217 189070 INFO nova.compute.claims [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:21:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:56.638 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:21:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:56.640 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:21:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:56.641 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:21:56 compute-1 podman[220526]: 2025-12-05 09:21:56.70150395 +0000 UTC m=+0.140357537 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Dec 05 09:21:56 compute-1 nova_compute[189066]: 2025-12-05 09:21:56.718 189070 DEBUG nova.compute.provider_tree [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:21:57 compute-1 nova_compute[189066]: 2025-12-05 09:21:57.483 189070 DEBUG nova.scheduler.client.report [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:21:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:57.500 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:dc:b6 10.100.0.2 2001:db8::f816:3eff:fed1:dcb6'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fed1:dcb6/64', 'neutron:device_id': 'ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a03fccf0-8b31-495a-b68a-70be5d3c0194, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdbc0f28-ff71-4c6c-87fd-d55723f69ac2) old=Port_Binding(mac=['fa:16:3e:d1:dc:b6 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:21:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:57.502 105272 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdbc0f28-ff71-4c6c-87fd-d55723f69ac2 in datapath f58cc02f-396f-494d-8f1e-d6f4412689c2 updated
Dec 05 09:21:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:57.505 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f58cc02f-396f-494d-8f1e-d6f4412689c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:21:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:57.507 105272 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmppzj8vekf/privsep.sock']
Dec 05 09:21:57 compute-1 nova_compute[189066]: 2025-12-05 09:21:57.524 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:21:57 compute-1 nova_compute[189066]: 2025-12-05 09:21:57.525 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:21:57 compute-1 nova_compute[189066]: 2025-12-05 09:21:57.684 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:21:57 compute-1 nova_compute[189066]: 2025-12-05 09:21:57.685 189070 DEBUG nova.network.neutron [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:21:57 compute-1 nova_compute[189066]: 2025-12-05 09:21:57.753 189070 INFO nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:21:57 compute-1 nova_compute[189066]: 2025-12-05 09:21:57.824 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:21:58 compute-1 nova_compute[189066]: 2025-12-05 09:21:58.000 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:21:58 compute-1 nova_compute[189066]: 2025-12-05 09:21:58.002 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:21:58 compute-1 nova_compute[189066]: 2025-12-05 09:21:58.002 189070 INFO nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Creating image(s)
Dec 05 09:21:58 compute-1 nova_compute[189066]: 2025-12-05 09:21:58.003 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:21:58 compute-1 nova_compute[189066]: 2025-12-05 09:21:58.013 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:21:58 compute-1 nova_compute[189066]: 2025-12-05 09:21:58.015 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:21:58 compute-1 nova_compute[189066]: 2025-12-05 09:21:58.015 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:21:58 compute-1 nova_compute[189066]: 2025-12-05 09:21:58.016 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:21:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:58.443 105272 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 09:21:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:58.444 105272 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmppzj8vekf/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 05 09:21:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:58.245 220556 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 09:21:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:58.250 220556 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 09:21:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:58.253 220556 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 05 09:21:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:58.253 220556 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220556
Dec 05 09:21:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:58.447 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4913476f-d70f-4cfb-80b8-866637e5b0c4]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:21:58 compute-1 podman[220560]: 2025-12-05 09:21:58.608531769 +0000 UTC m=+0.052609059 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:21:59 compute-1 nova_compute[189066]: 2025-12-05 09:21:59.016 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:21:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:59.173 220556 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:21:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:59.174 220556 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:21:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:59.174 220556 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:21:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:21:59.285 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0c51cfcd-382d-4261-ba3a-2acb5cf61942]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:00 compute-1 nova_compute[189066]: 2025-12-05 09:22:00.623 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:00 compute-1 nova_compute[189066]: 2025-12-05 09:22:00.623 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 09:22:00 compute-1 nova_compute[189066]: 2025-12-05 09:22:00.657 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 09:22:01 compute-1 podman[220580]: 2025-12-05 09:22:01.637814435 +0000 UTC m=+0.071393474 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.056 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.266 189070 WARNING oslo_policy.policy [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.267 189070 WARNING oslo_policy.policy [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.270 189070 DEBUG nova.policy [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.311 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.396 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce.part --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.397 189070 DEBUG nova.virt.images [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] 3ebffd97-b242-42d7-b245-ebdaf8e4377c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.399 189070 DEBUG nova.privsep.utils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 09:22:02 compute-1 nova_compute[189066]: 2025-12-05 09:22:02.399 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce.part /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:03 compute-1 nova_compute[189066]: 2025-12-05 09:22:03.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:03 compute-1 nova_compute[189066]: 2025-12-05 09:22:03.439 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce.part /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce.converted" returned: 0 in 1.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:03 compute-1 nova_compute[189066]: 2025-12-05 09:22:03.444 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:03 compute-1 nova_compute[189066]: 2025-12-05 09:22:03.525 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce.converted --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:03 compute-1 nova_compute[189066]: 2025-12-05 09:22:03.526 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 5.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:03 compute-1 nova_compute[189066]: 2025-12-05 09:22:03.541 189070 INFO oslo.privsep.daemon [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpkg6gi1j1/privsep.sock']
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.169 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.171 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.362 189070 INFO oslo.privsep.daemon [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Spawned new privsep daemon via rootwrap
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.178 220618 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.185 220618 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.187 220618 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.187 220618 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220618
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.471 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.526 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.527 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.528 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.539 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.592 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.593 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.629 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.630 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.630 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.682 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.684 189070 DEBUG nova.virt.disk.api [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Checking if we can resize image /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.684 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.738 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.739 189070 DEBUG nova.virt.disk.api [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Cannot resize image /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.739 189070 DEBUG nova.objects.instance [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b94d7d0-b000-460f-8883-8953d60115d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.760 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.760 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Ensure instance console log exists: /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.761 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.761 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:04 compute-1 nova_compute[189066]: 2025-12-05 09:22:04.762 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.080 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.081 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.081 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.081 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.242 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.243 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5930MB free_disk=73.33760070800781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.243 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.243 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.424 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.425 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.425 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.581 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.649 189070 ERROR nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [req-bc3afa9e-8392-45dc-aca2-6f574aabfb1d] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID be68f9f1-7820-4bfa-8dbd-210e13729f64.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-bc3afa9e-8392-45dc-aca2-6f574aabfb1d"}]}
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.673 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing inventories for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.705 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating ProviderTree inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.706 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.738 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing aggregate associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.786 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing trait associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.844 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.926 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updated inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.927 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.927 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.964 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:22:05 compute-1 nova_compute[189066]: 2025-12-05 09:22:05.964 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:06 compute-1 nova_compute[189066]: 2025-12-05 09:22:06.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:06 compute-1 nova_compute[189066]: 2025-12-05 09:22:06.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:06 compute-1 nova_compute[189066]: 2025-12-05 09:22:06.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 09:22:06 compute-1 podman[220635]: 2025-12-05 09:22:06.622994386 +0000 UTC m=+0.062495033 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.044 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.044 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.045 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.071 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.072 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.072 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.102 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.102 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.133 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.265 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.266 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.282 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.283 189070 INFO nova.compute.claims [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.457 189070 DEBUG nova.compute.provider_tree [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.481 189070 DEBUG nova.scheduler.client.report [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.509 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.510 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.598 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.599 189070 DEBUG nova.network.neutron [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.629 189070 INFO nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.666 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.839 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.840 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.840 189070 INFO nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Creating image(s)
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.841 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "/var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.841 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "/var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.842 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "/var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.855 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.897 189070 DEBUG nova.policy [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.922 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.923 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.924 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.935 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.997 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:07 compute-1 nova_compute[189066]: 2025-12-05 09:22:07.998 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.036 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.037 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.037 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.099 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.100 189070 DEBUG nova.virt.disk.api [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Checking if we can resize image /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.101 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.167 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.168 189070 DEBUG nova.virt.disk.api [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Cannot resize image /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.169 189070 DEBUG nova.objects.instance [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lazy-loading 'migration_context' on Instance uuid 95f85266-e2bc-4615-b523-e7346ad3ab40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.192 189070 DEBUG nova.network.neutron [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Successfully created port: b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.199 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.200 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Ensure instance console log exists: /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.200 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.201 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:08 compute-1 nova_compute[189066]: 2025-12-05 09:22:08.201 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:08 compute-1 podman[220672]: 2025-12-05 09:22:08.640605696 +0000 UTC m=+0.075296360 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:22:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:08.867 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:08.868 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:08.868 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:09 compute-1 nova_compute[189066]: 2025-12-05 09:22:09.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:09 compute-1 nova_compute[189066]: 2025-12-05 09:22:09.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:22:09 compute-1 nova_compute[189066]: 2025-12-05 09:22:09.670 189070 DEBUG nova.network.neutron [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Successfully created port: fbbb4f47-a5b9-460c-ae65-29a664051272 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:22:11 compute-1 nova_compute[189066]: 2025-12-05 09:22:11.855 189070 DEBUG nova.network.neutron [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Successfully updated port: fbbb4f47-a5b9-460c-ae65-29a664051272 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:22:12 compute-1 nova_compute[189066]: 2025-12-05 09:22:12.090 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:22:12 compute-1 nova_compute[189066]: 2025-12-05 09:22:12.091 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquired lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:22:12 compute-1 nova_compute[189066]: 2025-12-05 09:22:12.091 189070 DEBUG nova.network.neutron [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:22:12 compute-1 nova_compute[189066]: 2025-12-05 09:22:12.740 189070 DEBUG nova.network.neutron [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.218 189070 DEBUG nova.compute.manager [req-633898c8-60e1-4334-82c4-df473922a6b8 req-c91a7ce3-c11b-4927-a567-a7b2798258f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-changed-fbbb4f47-a5b9-460c-ae65-29a664051272 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.219 189070 DEBUG nova.compute.manager [req-633898c8-60e1-4334-82c4-df473922a6b8 req-c91a7ce3-c11b-4927-a567-a7b2798258f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Refreshing instance network info cache due to event network-changed-fbbb4f47-a5b9-460c-ae65-29a664051272. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.219 189070 DEBUG oslo_concurrency.lockutils [req-633898c8-60e1-4334-82c4-df473922a6b8 req-c91a7ce3-c11b-4927-a567-a7b2798258f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.395 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.429 189070 WARNING nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 0 instances on the hypervisor.
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.429 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Triggering sync for uuid 2b94d7d0-b000-460f-8883-8953d60115d0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.429 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Triggering sync for uuid 95f85266-e2bc-4615-b523-e7346ad3ab40 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.430 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:13 compute-1 nova_compute[189066]: 2025-12-05 09:22:13.430 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:15 compute-1 podman[220697]: 2025-12-05 09:22:15.62971763 +0000 UTC m=+0.068351619 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:22:20 compute-1 sshd-session[220721]: Received disconnect from 43.225.158.169 port 33783:11: Bye Bye [preauth]
Dec 05 09:22:20 compute-1 sshd-session[220721]: Disconnected from authenticating user root 43.225.158.169 port 33783 [preauth]
Dec 05 09:22:20 compute-1 nova_compute[189066]: 2025-12-05 09:22:20.903 189070 DEBUG nova.network.neutron [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updating instance_info_cache with network_info: [{"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.179 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Releasing lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.179 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Instance network_info: |[{"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.179 189070 DEBUG oslo_concurrency.lockutils [req-633898c8-60e1-4334-82c4-df473922a6b8 req-c91a7ce3-c11b-4927-a567-a7b2798258f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.180 189070 DEBUG nova.network.neutron [req-633898c8-60e1-4334-82c4-df473922a6b8 req-c91a7ce3-c11b-4927-a567-a7b2798258f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Refreshing network info cache for port fbbb4f47-a5b9-460c-ae65-29a664051272 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.183 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Start _get_guest_xml network_info=[{"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.189 189070 WARNING nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.200 189070 DEBUG nova.virt.libvirt.host [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.201 189070 DEBUG nova.virt.libvirt.host [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.207 189070 DEBUG nova.virt.libvirt.host [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.207 189070 DEBUG nova.virt.libvirt.host [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.209 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.209 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.210 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.210 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.210 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.211 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.211 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.211 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.211 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.212 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.212 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.212 189070 DEBUG nova.virt.hardware [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.218 189070 DEBUG nova.privsep.utils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.219 189070 DEBUG nova.virt.libvirt.vif [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:22:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1827012883',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1827012883',id=4,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aa5b008731384d18ba83c8d69e76bcef',ramdisk_id='',reservation_id='r-0233w6ol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-63392288',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-63392288-pro
ject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:22:07Z,user_data=None,user_id='a9b7ab1c9c854146af8af16f337a063d',uuid=95f85266-e2bc-4615-b523-e7346ad3ab40,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.219 189070 DEBUG nova.network.os_vif_util [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Converting VIF {"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.221 189070 DEBUG nova.network.os_vif_util [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:6e:e7,bridge_name='br-int',has_traffic_filtering=True,id=fbbb4f47-a5b9-460c-ae65-29a664051272,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb4f47-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.222 189070 DEBUG nova.objects.instance [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lazy-loading 'pci_devices' on Instance uuid 95f85266-e2bc-4615-b523-e7346ad3ab40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.338 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <uuid>95f85266-e2bc-4615-b523-e7346ad3ab40</uuid>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <name>instance-00000004</name>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1827012883</nova:name>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:22:21</nova:creationTime>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:22:21 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:22:21 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:22:21 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:22:21 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:22:21 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:22:21 compute-1 nova_compute[189066]:         <nova:user uuid="a9b7ab1c9c854146af8af16f337a063d">tempest-LiveAutoBlockMigrationV225Test-63392288-project-member</nova:user>
Dec 05 09:22:21 compute-1 nova_compute[189066]:         <nova:project uuid="aa5b008731384d18ba83c8d69e76bcef">tempest-LiveAutoBlockMigrationV225Test-63392288</nova:project>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:22:21 compute-1 nova_compute[189066]:         <nova:port uuid="fbbb4f47-a5b9-460c-ae65-29a664051272">
Dec 05 09:22:21 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <system>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <entry name="serial">95f85266-e2bc-4615-b523-e7346ad3ab40</entry>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <entry name="uuid">95f85266-e2bc-4615-b523-e7346ad3ab40</entry>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </system>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <os>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   </os>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <features>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   </features>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk.config"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:d6:6e:e7"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <target dev="tapfbbb4f47-a5"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/console.log" append="off"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <video>
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </video>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:22:21 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:22:21 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:22:21 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:22:21 compute-1 nova_compute[189066]: </domain>
Dec 05 09:22:21 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.340 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Preparing to wait for external event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.340 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.341 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.341 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.342 189070 DEBUG nova.virt.libvirt.vif [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:22:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1827012883',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1827012883',id=4,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aa5b008731384d18ba83c8d69e76bcef',ramdisk_id='',reservation_id='r-0233w6ol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-63392288',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-63
392288-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:22:07Z,user_data=None,user_id='a9b7ab1c9c854146af8af16f337a063d',uuid=95f85266-e2bc-4615-b523-e7346ad3ab40,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.342 189070 DEBUG nova.network.os_vif_util [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Converting VIF {"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.343 189070 DEBUG nova.network.os_vif_util [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:6e:e7,bridge_name='br-int',has_traffic_filtering=True,id=fbbb4f47-a5b9-460c-ae65-29a664051272,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb4f47-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.343 189070 DEBUG os_vif [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:6e:e7,bridge_name='br-int',has_traffic_filtering=True,id=fbbb4f47-a5b9-460c-ae65-29a664051272,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb4f47-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.386 189070 DEBUG ovsdbapp.backend.ovs_idl [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.386 189070 DEBUG ovsdbapp.backend.ovs_idl [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.386 189070 DEBUG ovsdbapp.backend.ovs_idl [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.387 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.388 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [POLLOUT] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.388 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.388 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.390 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.391 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.400 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.401 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.401 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:22:21 compute-1 nova_compute[189066]: 2025-12-05 09:22:21.402 189070 INFO oslo.privsep.daemon [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmploljbi8s/privsep.sock']
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.358 189070 INFO oslo.privsep.daemon [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Spawned new privsep daemon via rootwrap
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.024 220727 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.146 220727 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.149 220727 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.149 220727 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220727
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.971 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.972 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbbb4f47-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.973 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfbbb4f47-a5, col_values=(('external_ids', {'iface-id': 'fbbb4f47-a5b9-460c-ae65-29a664051272', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:6e:e7', 'vm-uuid': '95f85266-e2bc-4615-b523-e7346ad3ab40'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.975 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:22 compute-1 NetworkManager[55704]: <info>  [1764926542.9778] manager: (tapfbbb4f47-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.981 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.984 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:22 compute-1 nova_compute[189066]: 2025-12-05 09:22:22.985 189070 INFO os_vif [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:6e:e7,bridge_name='br-int',has_traffic_filtering=True,id=fbbb4f47-a5b9-460c-ae65-29a664051272,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb4f47-a5')
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.177 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.178 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.178 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] No VIF found with MAC fa:16:3e:d6:6e:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.179 189070 INFO nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Using config drive
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.193 189070 DEBUG nova.network.neutron [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Successfully updated port: b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.404 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.466 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.466 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquired lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.466 189070 DEBUG nova.network.neutron [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.741 189070 DEBUG nova.network.neutron [req-633898c8-60e1-4334-82c4-df473922a6b8 req-c91a7ce3-c11b-4927-a567-a7b2798258f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updated VIF entry in instance network info cache for port fbbb4f47-a5b9-460c-ae65-29a664051272. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:22:23 compute-1 nova_compute[189066]: 2025-12-05 09:22:23.741 189070 DEBUG nova.network.neutron [req-633898c8-60e1-4334-82c4-df473922a6b8 req-c91a7ce3-c11b-4927-a567-a7b2798258f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updating instance_info_cache with network_info: [{"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:22:24 compute-1 nova_compute[189066]: 2025-12-05 09:22:24.096 189070 DEBUG oslo_concurrency.lockutils [req-633898c8-60e1-4334-82c4-df473922a6b8 req-c91a7ce3-c11b-4927-a567-a7b2798258f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:22:24 compute-1 podman[220733]: 2025-12-05 09:22:24.866780112 +0000 UTC m=+0.114663562 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 05 09:22:26 compute-1 nova_compute[189066]: 2025-12-05 09:22:26.207 189070 DEBUG nova.network.neutron [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:22:26 compute-1 nova_compute[189066]: 2025-12-05 09:22:26.580 189070 INFO nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Creating config drive at /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk.config
Dec 05 09:22:26 compute-1 nova_compute[189066]: 2025-12-05 09:22:26.585 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfwzefzjm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:26 compute-1 nova_compute[189066]: 2025-12-05 09:22:26.710 189070 DEBUG oslo_concurrency.processutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfwzefzjm" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:26 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 05 09:22:26 compute-1 kernel: tapfbbb4f47-a5: entered promiscuous mode
Dec 05 09:22:26 compute-1 NetworkManager[55704]: <info>  [1764926546.8191] manager: (tapfbbb4f47-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Dec 05 09:22:26 compute-1 ovn_controller[95809]: 2025-12-05T09:22:26Z|00027|binding|INFO|Claiming lport fbbb4f47-a5b9-460c-ae65-29a664051272 for this chassis.
Dec 05 09:22:26 compute-1 ovn_controller[95809]: 2025-12-05T09:22:26Z|00028|binding|INFO|fbbb4f47-a5b9-460c-ae65-29a664051272: Claiming fa:16:3e:d6:6e:e7 10.100.0.10
Dec 05 09:22:26 compute-1 nova_compute[189066]: 2025-12-05 09:22:26.822 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:26 compute-1 systemd-udevd[220794]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:22:26 compute-1 NetworkManager[55704]: <info>  [1764926546.8718] device (tapfbbb4f47-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:22:26 compute-1 NetworkManager[55704]: <info>  [1764926546.8739] device (tapfbbb4f47-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:22:26 compute-1 nova_compute[189066]: 2025-12-05 09:22:26.907 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:26 compute-1 systemd-machined[154815]: New machine qemu-1-instance-00000004.
Dec 05 09:22:26 compute-1 ovn_controller[95809]: 2025-12-05T09:22:26Z|00029|binding|INFO|Setting lport fbbb4f47-a5b9-460c-ae65-29a664051272 ovn-installed in OVS
Dec 05 09:22:26 compute-1 nova_compute[189066]: 2025-12-05 09:22:26.911 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:26 compute-1 podman[220763]: 2025-12-05 09:22:26.915000238 +0000 UTC m=+0.132046832 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:22:26 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000004.
Dec 05 09:22:27 compute-1 nova_compute[189066]: 2025-12-05 09:22:27.516 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926547.5142682, 95f85266-e2bc-4615-b523-e7346ad3ab40 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:22:27 compute-1 nova_compute[189066]: 2025-12-05 09:22:27.517 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] VM Started (Lifecycle Event)
Dec 05 09:22:27 compute-1 nova_compute[189066]: 2025-12-05 09:22:27.976 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:28 compute-1 nova_compute[189066]: 2025-12-05 09:22:28.406 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:29 compute-1 podman[220825]: 2025-12-05 09:22:29.650096019 +0000 UTC m=+0.074205024 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:22:32 compute-1 podman[220844]: 2025-12-05 09:22:32.635216824 +0000 UTC m=+0.068266287 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:22:33 compute-1 nova_compute[189066]: 2025-12-05 09:22:33.016 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:33 compute-1 nova_compute[189066]: 2025-12-05 09:22:33.408 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:36 compute-1 ovn_controller[95809]: 2025-12-05T09:22:36Z|00030|binding|INFO|Setting lport fbbb4f47-a5b9-460c-ae65-29a664051272 up in Southbound
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.118 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:6e:e7 10.100.0.10'], port_security=['fa:16:3e:d6:6e:e7 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab115429-13ca-4258-9a08-fc76d0ca17b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=36325ae0-997b-4e15-a889-e33151da06b1, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=fbbb4f47-a5b9-460c-ae65-29a664051272) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.120 105272 INFO neutron.agent.ovn.metadata.agent [-] Port fbbb4f47-a5b9-460c-ae65-29a664051272 in datapath 0a97aec7-0780-4b5e-9498-e796fd7b42fd bound to our chassis
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.123 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a97aec7-0780-4b5e-9498-e796fd7b42fd
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.767 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b6085059-a9b0-4a96-95d9-12a49d28b140]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.769 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a97aec7-01 in ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.772 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a97aec7-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.772 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[99569a6e-8533-435d-aadd-9fc55a20ba5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.773 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[03e55852-c198-45f0-9e42-35f7b6c3d2a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.802 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[64621d91-a051-4807-9b2c-605b86a3849a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.822 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[618057bb-cc19-4d04-8307-9a598bec01ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:36.825 105272 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp2szn1li9/privsep.sock']
Dec 05 09:22:36 compute-1 podman[220869]: 2025-12-05 09:22:36.925614816 +0000 UTC m=+0.091875600 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Dec 05 09:22:38 compute-1 nova_compute[189066]: 2025-12-05 09:22:38.019 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:38.021 105272 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:38.022 105272 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp2szn1li9/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:37.681 220896 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:37.688 220896 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:37.691 220896 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:37.691 220896 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220896
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:38.025 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[76669cb3-1430-4fec-9c7a-659a1fd0caf5]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:38 compute-1 nova_compute[189066]: 2025-12-05 09:22:38.411 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:38.753 220896 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:38.753 220896 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:38 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:38.753 220896 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.771 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[a09860c0-413e-4258-8614-c8c2ff06220c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:39 compute-1 podman[220901]: 2025-12-05 09:22:39.776084016 +0000 UTC m=+0.076226245 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:22:39 compute-1 NetworkManager[55704]: <info>  [1764926559.7918] manager: (tap0a97aec7-00): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.791 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[182d3042-7af5-4c52-86e3-00a143283ec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.820 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1034f5-a619-4f1b-88e3-4d4a3f2c921e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:39 compute-1 systemd-udevd[220930]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.823 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[2009d41c-022e-4e54-9eb4-12deb6d6699a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:39 compute-1 NetworkManager[55704]: <info>  [1764926559.8475] device (tap0a97aec7-00): carrier: link connected
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.855 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5017f7-5b96-439e-8cb1-11fb24fe5626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.876 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[27c40f3e-eb30-4d7a-9474-0a07f98e21c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a97aec7-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:cb:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383285, 'reachable_time': 33580, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220948, 'error': None, 'target': 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.895 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3627b07a-ac5d-4f06-b87d-93387ec23d65]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe66:cbd0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383285, 'tstamp': 383285}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220949, 'error': None, 'target': 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.914 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[79be0da3-eb5d-4dfe-837c-80593d4e1f9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a97aec7-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:cb:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383285, 'reachable_time': 33580, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220950, 'error': None, 'target': 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:39.946 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ded4867e-fb9a-4446-93ac-634c9760b638]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.046 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[aca21ae4-c178-432c-a682-d7c35bb3f46a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.049 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a97aec7-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.049 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.049 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a97aec7-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:40 compute-1 kernel: tap0a97aec7-00: entered promiscuous mode
Dec 05 09:22:40 compute-1 NetworkManager[55704]: <info>  [1764926560.0525] manager: (tap0a97aec7-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Dec 05 09:22:40 compute-1 nova_compute[189066]: 2025-12-05 09:22:40.052 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.056 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a97aec7-00, col_values=(('external_ids', {'iface-id': '0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:40 compute-1 nova_compute[189066]: 2025-12-05 09:22:40.058 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:40 compute-1 ovn_controller[95809]: 2025-12-05T09:22:40Z|00031|binding|INFO|Releasing lport 0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa from this chassis (sb_readonly=0)
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.060 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a97aec7-0780-4b5e-9498-e796fd7b42fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a97aec7-0780-4b5e-9498-e796fd7b42fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.061 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf006a7-a53b-4f95-b236-8d7ba11dd3d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.064 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-0a97aec7-0780-4b5e-9498-e796fd7b42fd
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/0a97aec7-0780-4b5e-9498-e796fd7b42fd.pid.haproxy
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 0a97aec7-0780-4b5e-9498-e796fd7b42fd
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:22:40 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:40.065 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'env', 'PROCESS_TAG=haproxy-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a97aec7-0780-4b5e-9498-e796fd7b42fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:22:40 compute-1 nova_compute[189066]: 2025-12-05 09:22:40.070 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:40 compute-1 podman[220985]: 2025-12-05 09:22:40.663880322 +0000 UTC m=+0.048489229 container create 70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:22:40 compute-1 systemd[1]: Started libpod-conmon-70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae.scope.
Dec 05 09:22:40 compute-1 podman[220985]: 2025-12-05 09:22:40.637759996 +0000 UTC m=+0.022368923 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:22:40 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:22:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0ab732b937e1a0aa90885ab57a8a882c3f199767b0ea9071c131766f2258997/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:22:40 compute-1 podman[220985]: 2025-12-05 09:22:40.763711017 +0000 UTC m=+0.148319944 container init 70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:22:40 compute-1 podman[220985]: 2025-12-05 09:22:40.769902741 +0000 UTC m=+0.154511648 container start 70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 05 09:22:40 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[221001]: [NOTICE]   (221005) : New worker (221007) forked
Dec 05 09:22:40 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[221001]: [NOTICE]   (221005) : Loading success.
Dec 05 09:22:41 compute-1 nova_compute[189066]: 2025-12-05 09:22:41.296 189070 DEBUG nova.network.neutron [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating instance_info_cache with network_info: [{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:22:42 compute-1 sshd-session[220961]: Received disconnect from 122.168.194.41 port 56474:11: Bye Bye [preauth]
Dec 05 09:22:42 compute-1 sshd-session[220961]: Disconnected from authenticating user root 122.168.194.41 port 56474 [preauth]
Dec 05 09:22:43 compute-1 nova_compute[189066]: 2025-12-05 09:22:43.022 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:43 compute-1 nova_compute[189066]: 2025-12-05 09:22:43.413 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:45 compute-1 nova_compute[189066]: 2025-12-05 09:22:45.625 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:22:45 compute-1 nova_compute[189066]: 2025-12-05 09:22:45.635 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926547.5145826, 95f85266-e2bc-4615-b523-e7346ad3ab40 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:22:45 compute-1 nova_compute[189066]: 2025-12-05 09:22:45.636 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] VM Paused (Lifecycle Event)
Dec 05 09:22:45 compute-1 nova_compute[189066]: 2025-12-05 09:22:45.762 189070 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 7.29 sec
Dec 05 09:22:46 compute-1 podman[221016]: 2025-12-05 09:22:46.631814414 +0000 UTC m=+0.064906704 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.025 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.074 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Releasing lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.075 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Instance network_info: |[{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.077 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.081 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Start _get_guest_xml network_info=[{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.086 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.095 189070 WARNING nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.105 189070 DEBUG nova.virt.libvirt.host [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.107 189070 DEBUG nova.virt.libvirt.host [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.114 189070 DEBUG nova.virt.libvirt.host [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.115 189070 DEBUG nova.virt.libvirt.host [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.117 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.118 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.118 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.118 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.119 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.119 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.119 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.119 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.120 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.120 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.120 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.121 189070 DEBUG nova.virt.hardware [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.125 189070 DEBUG nova.virt.libvirt.vif [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:21:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-804921789',display_name='tempest-TestNetworkAdvancedServerOps-server-804921789',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-804921789',id=3,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDW3qadE13zsRSDKJlvOjZ/sgAUF4xLDU490CG+fZt4wcZlik42yE/XMAhTvQFBPRgAXCQSd0Lb+5EKERW2D+JVe4gr8wR12Ds02HfiWcpROJrTejCLEVPRm4reH/9sdrA==',key_name='tempest-TestNetworkAdvancedServerOps-175847342',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-rmofgs1d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:21:57Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=2b94d7d0-b000-460f-8883-8953d60115d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.126 189070 DEBUG nova.network.os_vif_util [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.127 189070 DEBUG nova.network.os_vif_util [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.129 189070 DEBUG nova.objects.instance [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b94d7d0-b000-460f-8883-8953d60115d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.162 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <uuid>2b94d7d0-b000-460f-8883-8953d60115d0</uuid>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <name>instance-00000003</name>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-804921789</nova:name>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:22:48</nova:creationTime>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:22:48 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:22:48 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:22:48 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:22:48 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:22:48 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:22:48 compute-1 nova_compute[189066]:         <nova:user uuid="65751a90715341b2984ef84ebbaa1650">tempest-TestNetworkAdvancedServerOps-1829130727-project-member</nova:user>
Dec 05 09:22:48 compute-1 nova_compute[189066]:         <nova:project uuid="e26ae3fdd48d4947978a480f70e14f84">tempest-TestNetworkAdvancedServerOps-1829130727</nova:project>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:22:48 compute-1 nova_compute[189066]:         <nova:port uuid="b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5">
Dec 05 09:22:48 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <system>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <entry name="serial">2b94d7d0-b000-460f-8883-8953d60115d0</entry>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <entry name="uuid">2b94d7d0-b000-460f-8883-8953d60115d0</entry>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </system>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <os>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   </os>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <features>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   </features>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.config"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:57:1d:2e"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <target dev="tapb2d6abb2-b3"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/console.log" append="off"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <video>
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </video>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:22:48 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:22:48 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:22:48 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:22:48 compute-1 nova_compute[189066]: </domain>
Dec 05 09:22:48 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.164 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Preparing to wait for external event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.164 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.165 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.165 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.165 189070 DEBUG nova.virt.libvirt.vif [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:21:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-804921789',display_name='tempest-TestNetworkAdvancedServerOps-server-804921789',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-804921789',id=3,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDW3qadE13zsRSDKJlvOjZ/sgAUF4xLDU490CG+fZt4wcZlik42yE/XMAhTvQFBPRgAXCQSd0Lb+5EKERW2D+JVe4gr8wR12Ds02HfiWcpROJrTejCLEVPRm4reH/9sdrA==',key_name='tempest-TestNetworkAdvancedServerOps-175847342',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-rmofgs1d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:21:57Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=2b94d7d0-b000-460f-8883-8953d60115d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.166 189070 DEBUG nova.network.os_vif_util [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.166 189070 DEBUG nova.network.os_vif_util [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.167 189070 DEBUG os_vif [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.168 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.169 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.169 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.170 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.174 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.174 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2d6abb2-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.175 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb2d6abb2-b3, col_values=(('external_ids', {'iface-id': 'b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:1d:2e', 'vm-uuid': '2b94d7d0-b000-460f-8883-8953d60115d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.177 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:48 compute-1 NetworkManager[55704]: <info>  [1764926568.1783] manager: (tapb2d6abb2-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.180 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.185 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.186 189070 INFO os_vif [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3')
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.302 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.302 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.303 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No VIF found with MAC fa:16:3e:57:1d:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.303 189070 INFO nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Using config drive
Dec 05 09:22:48 compute-1 nova_compute[189066]: 2025-12-05 09:22:48.415 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:49 compute-1 nova_compute[189066]: 2025-12-05 09:22:49.549 189070 DEBUG nova.compute.manager [req-c7ee3d65-2f4b-4368-82a7-c08eb8d4e512 req-0909bd8c-3f02-421c-bffa-a0207d5527be 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-changed-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:22:49 compute-1 nova_compute[189066]: 2025-12-05 09:22:49.550 189070 DEBUG nova.compute.manager [req-c7ee3d65-2f4b-4368-82a7-c08eb8d4e512 req-0909bd8c-3f02-421c-bffa-a0207d5527be 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Refreshing instance network info cache due to event network-changed-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:22:49 compute-1 nova_compute[189066]: 2025-12-05 09:22:49.550 189070 DEBUG oslo_concurrency.lockutils [req-c7ee3d65-2f4b-4368-82a7-c08eb8d4e512 req-0909bd8c-3f02-421c-bffa-a0207d5527be 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:22:49 compute-1 nova_compute[189066]: 2025-12-05 09:22:49.550 189070 DEBUG oslo_concurrency.lockutils [req-c7ee3d65-2f4b-4368-82a7-c08eb8d4e512 req-0909bd8c-3f02-421c-bffa-a0207d5527be 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:22:49 compute-1 nova_compute[189066]: 2025-12-05 09:22:49.551 189070 DEBUG nova.network.neutron [req-c7ee3d65-2f4b-4368-82a7-c08eb8d4e512 req-0909bd8c-3f02-421c-bffa-a0207d5527be 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Refreshing network info cache for port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:22:51 compute-1 nova_compute[189066]: 2025-12-05 09:22:51.138 189070 INFO nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Creating config drive at /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.config
Dec 05 09:22:51 compute-1 nova_compute[189066]: 2025-12-05 09:22:51.149 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6oku0s7y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:22:51 compute-1 nova_compute[189066]: 2025-12-05 09:22:51.280 189070 DEBUG oslo_concurrency.processutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6oku0s7y" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:22:51 compute-1 kernel: tapb2d6abb2-b3: entered promiscuous mode
Dec 05 09:22:51 compute-1 NetworkManager[55704]: <info>  [1764926571.3549] manager: (tapb2d6abb2-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Dec 05 09:22:51 compute-1 ovn_controller[95809]: 2025-12-05T09:22:51Z|00032|binding|INFO|Claiming lport b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 for this chassis.
Dec 05 09:22:51 compute-1 ovn_controller[95809]: 2025-12-05T09:22:51Z|00033|binding|INFO|b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5: Claiming fa:16:3e:57:1d:2e 10.100.0.8
Dec 05 09:22:51 compute-1 nova_compute[189066]: 2025-12-05 09:22:51.356 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:51 compute-1 nova_compute[189066]: 2025-12-05 09:22:51.359 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:51 compute-1 nova_compute[189066]: 2025-12-05 09:22:51.364 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:51 compute-1 systemd-machined[154815]: New machine qemu-2-instance-00000003.
Dec 05 09:22:51 compute-1 ovn_controller[95809]: 2025-12-05T09:22:51Z|00034|binding|INFO|Setting lport b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 ovn-installed in OVS
Dec 05 09:22:51 compute-1 nova_compute[189066]: 2025-12-05 09:22:51.419 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:51 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Dec 05 09:22:51 compute-1 systemd-udevd[221061]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:22:51 compute-1 NetworkManager[55704]: <info>  [1764926571.4493] device (tapb2d6abb2-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:22:51 compute-1 NetworkManager[55704]: <info>  [1764926571.4504] device (tapb2d6abb2-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:22:52 compute-1 nova_compute[189066]: 2025-12-05 09:22:52.053 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926572.051887, 2b94d7d0-b000-460f-8883-8953d60115d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:22:52 compute-1 nova_compute[189066]: 2025-12-05 09:22:52.054 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] VM Started (Lifecycle Event)
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.178 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.417 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:53 compute-1 ovn_controller[95809]: 2025-12-05T09:22:53Z|00035|binding|INFO|Setting lport b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 up in Southbound
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.601 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:1d:2e 10.100.0.8'], port_security=['fa:16:3e:57:1d:2e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-359084ff-88c9-4324-bfef-b1ef9f403118', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc71c22e-381a-49db-a2d4-d596204126ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e32f91e2-293f-4583-87a5-1928a1341468, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.603 105272 INFO neutron.agent.ovn.metadata.agent [-] Port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 in datapath 359084ff-88c9-4324-bfef-b1ef9f403118 bound to our chassis
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.605 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 359084ff-88c9-4324-bfef-b1ef9f403118
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.620 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[368f1945-4804-443a-87da-d944a2a57a8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.622 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap359084ff-81 in ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.625 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap359084ff-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.625 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4f12ebeb-5abe-4358-b3c9-b97526e17680]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.626 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e12fb89b-3553-48c5-83fb-853229c36dfa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.635 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.640 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926572.0565193, 2b94d7d0-b000-460f-8883-8953d60115d0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.640 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] VM Paused (Lifecycle Event)
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.662 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[f17050c3-0c3d-4fff-8197-f247ab423445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.683 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[abee45fc-ebe5-45be-808e-5364c8e19862]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.720 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[b61bf219-92a8-46cc-b88f-5b6e7de02e99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.729 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6f560762-c20f-42b1-a992-f6f0a77bdf5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 systemd-udevd[221065]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:22:53 compute-1 NetworkManager[55704]: <info>  [1764926573.7311] manager: (tap359084ff-80): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.764 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[fca48368-3e69-48d1-bace-f20e716aafcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.768 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[0ddbe5ac-ee76-44f9-921c-dac388727bba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.790 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:22:53 compute-1 NetworkManager[55704]: <info>  [1764926573.7925] device (tap359084ff-80): carrier: link connected
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.794 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.798 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[835f2afe-d430-482c-bf6b-9d71d627e8cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.820 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd56b6b-7455-45c1-add8-d5c7c7d07b65]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap359084ff-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:5b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384679, 'reachable_time': 17764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221102, 'error': None, 'target': 'ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.832 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.838 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e630c26f-38b1-47a0-bc19-99bc96c04ffd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb8:5bfe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384679, 'tstamp': 384679}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221103, 'error': None, 'target': 'ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.856 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[40eac0ce-a9e1-457f-8f30-559e75f07e85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap359084ff-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:5b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384679, 'reachable_time': 17764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221104, 'error': None, 'target': 'ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.898 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[63503347-94a5-430c-9ffb-4ac00ad6554b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.972 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8a50e770-4d86-448b-98a9-900546b2d475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.974 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap359084ff-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.974 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.975 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap359084ff-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.977 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:53 compute-1 NetworkManager[55704]: <info>  [1764926573.9780] manager: (tap359084ff-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Dec 05 09:22:53 compute-1 kernel: tap359084ff-80: entered promiscuous mode
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.980 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap359084ff-80, col_values=(('external_ids', {'iface-id': 'b3346b7b-37e6-4cad-b494-c202eaae0edf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:22:53 compute-1 ovn_controller[95809]: 2025-12-05T09:22:53Z|00036|binding|INFO|Releasing lport b3346b7b-37e6-4cad-b494-c202eaae0edf from this chassis (sb_readonly=0)
Dec 05 09:22:53 compute-1 nova_compute[189066]: 2025-12-05 09:22:53.994 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.995 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/359084ff-88c9-4324-bfef-b1ef9f403118.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/359084ff-88c9-4324-bfef-b1ef9f403118.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.997 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5afcb200-6ed6-4784-8cc7-a8cb9aa3a70a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.997 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-359084ff-88c9-4324-bfef-b1ef9f403118
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/359084ff-88c9-4324-bfef-b1ef9f403118.pid.haproxy
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 359084ff-88c9-4324-bfef-b1ef9f403118
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:22:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:53.998 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118', 'env', 'PROCESS_TAG=haproxy-359084ff-88c9-4324-bfef-b1ef9f403118', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/359084ff-88c9-4324-bfef-b1ef9f403118.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:22:54 compute-1 podman[221137]: 2025-12-05 09:22:54.60331449 +0000 UTC m=+0.064033663 container create e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:22:54 compute-1 systemd[1]: Started libpod-conmon-e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7.scope.
Dec 05 09:22:54 compute-1 podman[221137]: 2025-12-05 09:22:54.571626007 +0000 UTC m=+0.032345180 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:22:54 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:22:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4a53f9d0734f0aa74de1f2635f61e4b9f8b0e08303c49cd54990aeff28aece0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:22:54 compute-1 podman[221137]: 2025-12-05 09:22:54.692946934 +0000 UTC m=+0.153666147 container init e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:22:54 compute-1 podman[221137]: 2025-12-05 09:22:54.69889563 +0000 UTC m=+0.159614813 container start e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 09:22:54 compute-1 neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118[221153]: [NOTICE]   (221157) : New worker (221159) forked
Dec 05 09:22:54 compute-1 neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118[221153]: [NOTICE]   (221157) : Loading success.
Dec 05 09:22:55 compute-1 nova_compute[189066]: 2025-12-05 09:22:55.452 189070 DEBUG nova.network.neutron [req-c7ee3d65-2f4b-4368-82a7-c08eb8d4e512 req-0909bd8c-3f02-421c-bffa-a0207d5527be 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updated VIF entry in instance network info cache for port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:22:55 compute-1 nova_compute[189066]: 2025-12-05 09:22:55.453 189070 DEBUG nova.network.neutron [req-c7ee3d65-2f4b-4368-82a7-c08eb8d4e512 req-0909bd8c-3f02-421c-bffa-a0207d5527be 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating instance_info_cache with network_info: [{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:22:55 compute-1 nova_compute[189066]: 2025-12-05 09:22:55.509 189070 DEBUG oslo_concurrency.lockutils [req-c7ee3d65-2f4b-4368-82a7-c08eb8d4e512 req-0909bd8c-3f02-421c-bffa-a0207d5527be 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:22:55 compute-1 podman[221168]: 2025-12-05 09:22:55.643309524 +0000 UTC m=+0.081337129 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 09:22:57 compute-1 podman[221188]: 2025-12-05 09:22:57.653723407 +0000 UTC m=+0.089443101 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 09:22:58 compute-1 nova_compute[189066]: 2025-12-05 09:22:58.180 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:58 compute-1 nova_compute[189066]: 2025-12-05 09:22:58.419 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.224 189070 DEBUG nova.compute.manager [req-fb101ce0-2cf2-4958-9d11-f6ce8fbb77d6 req-8dafbd9f-c914-4277-a220-e342569afc93 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.225 189070 DEBUG oslo_concurrency.lockutils [req-fb101ce0-2cf2-4958-9d11-f6ce8fbb77d6 req-8dafbd9f-c914-4277-a220-e342569afc93 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.225 189070 DEBUG oslo_concurrency.lockutils [req-fb101ce0-2cf2-4958-9d11-f6ce8fbb77d6 req-8dafbd9f-c914-4277-a220-e342569afc93 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.225 189070 DEBUG oslo_concurrency.lockutils [req-fb101ce0-2cf2-4958-9d11-f6ce8fbb77d6 req-8dafbd9f-c914-4277-a220-e342569afc93 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.225 189070 DEBUG nova.compute.manager [req-fb101ce0-2cf2-4958-9d11-f6ce8fbb77d6 req-8dafbd9f-c914-4277-a220-e342569afc93 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Processing event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.226 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Instance event wait completed in 31 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.304 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926579.3033793, 95f85266-e2bc-4615-b523-e7346ad3ab40 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.305 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] VM Resumed (Lifecycle Event)
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.308 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.314 189070 INFO nova.virt.libvirt.driver [-] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Instance spawned successfully.
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.315 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:22:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:59.428 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:22:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:22:59.431 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.430 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.545 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.554 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.558 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.559 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.560 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.561 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.562 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.562 189070 DEBUG nova.virt.libvirt.driver [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.628 189070 DEBUG nova.compute.manager [req-a9147a86-b5e5-459d-9a12-b03e00aa50b9 req-d52155c9-80f6-419b-b7d3-687024379f6b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.629 189070 DEBUG oslo_concurrency.lockutils [req-a9147a86-b5e5-459d-9a12-b03e00aa50b9 req-d52155c9-80f6-419b-b7d3-687024379f6b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.629 189070 DEBUG oslo_concurrency.lockutils [req-a9147a86-b5e5-459d-9a12-b03e00aa50b9 req-d52155c9-80f6-419b-b7d3-687024379f6b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.630 189070 DEBUG oslo_concurrency.lockutils [req-a9147a86-b5e5-459d-9a12-b03e00aa50b9 req-d52155c9-80f6-419b-b7d3-687024379f6b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.630 189070 DEBUG nova.compute.manager [req-a9147a86-b5e5-459d-9a12-b03e00aa50b9 req-d52155c9-80f6-419b-b7d3-687024379f6b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Processing event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.631 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.638 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.647 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.648 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926579.6362834, 2b94d7d0-b000-460f-8883-8953d60115d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.648 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] VM Resumed (Lifecycle Event)
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.653 189070 INFO nova.virt.libvirt.driver [-] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Instance spawned successfully.
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.654 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.722 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.729 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.734 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.734 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.735 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.735 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.735 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.736 189070 DEBUG nova.virt.libvirt.driver [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.992 189070 INFO nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Took 52.15 seconds to spawn the instance on the hypervisor.
Dec 05 09:22:59 compute-1 nova_compute[189066]: 2025-12-05 09:22:59.997 189070 DEBUG nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:23:00 compute-1 nova_compute[189066]: 2025-12-05 09:23:00.378 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:23:00 compute-1 nova_compute[189066]: 2025-12-05 09:23:00.578 189070 INFO nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Took 62.58 seconds to spawn the instance on the hypervisor.
Dec 05 09:23:00 compute-1 nova_compute[189066]: 2025-12-05 09:23:00.579 189070 DEBUG nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:23:00 compute-1 nova_compute[189066]: 2025-12-05 09:23:00.592 189070 INFO nova.compute.manager [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Took 53.38 seconds to build instance.
Dec 05 09:23:00 compute-1 podman[221218]: 2025-12-05 09:23:00.648835768 +0000 UTC m=+0.070064902 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:23:00 compute-1 nova_compute[189066]: 2025-12-05 09:23:00.721 189070 DEBUG oslo_concurrency.lockutils [None req-2e8fba2f-945a-4074-ac8b-d3171dfe8b43 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 53.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:00 compute-1 nova_compute[189066]: 2025-12-05 09:23:00.721 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 47.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:00 compute-1 nova_compute[189066]: 2025-12-05 09:23:00.722 189070 INFO nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:23:00 compute-1 nova_compute[189066]: 2025-12-05 09:23:00.722 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:01 compute-1 nova_compute[189066]: 2025-12-05 09:23:01.059 189070 INFO nova.compute.manager [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Took 64.90 seconds to build instance.
Dec 05 09:23:01 compute-1 sshd-session[221216]: Received disconnect from 122.114.113.177 port 45948:11: Bye Bye [preauth]
Dec 05 09:23:01 compute-1 sshd-session[221216]: Disconnected from authenticating user root 122.114.113.177 port 45948 [preauth]
Dec 05 09:23:02 compute-1 nova_compute[189066]: 2025-12-05 09:23:02.057 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:03 compute-1 nova_compute[189066]: 2025-12-05 09:23:03.184 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:03 compute-1 nova_compute[189066]: 2025-12-05 09:23:03.423 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:03 compute-1 podman[221236]: 2025-12-05 09:23:03.647756598 +0000 UTC m=+0.081278713 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:23:03 compute-1 nova_compute[189066]: 2025-12-05 09:23:03.761 189070 DEBUG oslo_concurrency.lockutils [None req-b09eff91-d682-40db-9795-f01015de55bc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 67.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:03 compute-1 nova_compute[189066]: 2025-12-05 09:23:03.761 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "2b94d7d0-b000-460f-8883-8953d60115d0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 50.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:03 compute-1 nova_compute[189066]: 2025-12-05 09:23:03.762 189070 INFO nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:23:03 compute-1 nova_compute[189066]: 2025-12-05 09:23:03.762 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "2b94d7d0-b000-460f-8883-8953d60115d0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.413 189070 DEBUG nova.compute.manager [req-ecfd26bf-df96-4f30-9d53-ef7769f2a8f5 req-e27b7811-a975-485d-81c9-efa3cdeac74e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.414 189070 DEBUG oslo_concurrency.lockutils [req-ecfd26bf-df96-4f30-9d53-ef7769f2a8f5 req-e27b7811-a975-485d-81c9-efa3cdeac74e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.649 189070 DEBUG oslo_concurrency.lockutils [req-ecfd26bf-df96-4f30-9d53-ef7769f2a8f5 req-e27b7811-a975-485d-81c9-efa3cdeac74e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.650 189070 DEBUG oslo_concurrency.lockutils [req-ecfd26bf-df96-4f30-9d53-ef7769f2a8f5 req-e27b7811-a975-485d-81c9-efa3cdeac74e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.650 189070 DEBUG nova.compute.manager [req-ecfd26bf-df96-4f30-9d53-ef7769f2a8f5 req-e27b7811-a975-485d-81c9-efa3cdeac74e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] No waiting events found dispatching network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.651 189070 WARNING nova.compute.manager [req-ecfd26bf-df96-4f30-9d53-ef7769f2a8f5 req-e27b7811-a975-485d-81c9-efa3cdeac74e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received unexpected event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 for instance with vm_state active and task_state None.
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.679 189070 DEBUG nova.compute.manager [req-647e9e5c-0fe1-4b30-a004-744b6bf10fb0 req-03a08127-0697-4931-b422-3b5fb028c58a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.680 189070 DEBUG oslo_concurrency.lockutils [req-647e9e5c-0fe1-4b30-a004-744b6bf10fb0 req-03a08127-0697-4931-b422-3b5fb028c58a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.683 189070 DEBUG oslo_concurrency.lockutils [req-647e9e5c-0fe1-4b30-a004-744b6bf10fb0 req-03a08127-0697-4931-b422-3b5fb028c58a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.684 189070 DEBUG oslo_concurrency.lockutils [req-647e9e5c-0fe1-4b30-a004-744b6bf10fb0 req-03a08127-0697-4931-b422-3b5fb028c58a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.684 189070 DEBUG nova.compute.manager [req-647e9e5c-0fe1-4b30-a004-744b6bf10fb0 req-03a08127-0697-4931-b422-3b5fb028c58a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] No waiting events found dispatching network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:23:04 compute-1 nova_compute[189066]: 2025-12-05 09:23:04.684 189070 WARNING nova.compute.manager [req-647e9e5c-0fe1-4b30-a004-744b6bf10fb0 req-03a08127-0697-4931-b422-3b5fb028c58a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received unexpected event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 for instance with vm_state active and task_state None.
Dec 05 09:23:05 compute-1 nova_compute[189066]: 2025-12-05 09:23:05.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:06 compute-1 nova_compute[189066]: 2025-12-05 09:23:06.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:06 compute-1 nova_compute[189066]: 2025-12-05 09:23:06.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:06.434 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:23:07 compute-1 nova_compute[189066]: 2025-12-05 09:23:07.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:07 compute-1 nova_compute[189066]: 2025-12-05 09:23:07.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:23:07 compute-1 nova_compute[189066]: 2025-12-05 09:23:07.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:23:07 compute-1 podman[221254]: 2025-12-05 09:23:07.649795768 +0000 UTC m=+0.077539691 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 09:23:08 compute-1 nova_compute[189066]: 2025-12-05 09:23:08.222 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:08 compute-1 nova_compute[189066]: 2025-12-05 09:23:08.426 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:08.868 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:23:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:08.870 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:08.871 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:10 compute-1 nova_compute[189066]: 2025-12-05 09:23:10.091 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:23:10 compute-1 nova_compute[189066]: 2025-12-05 09:23:10.091 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:23:10 compute-1 nova_compute[189066]: 2025-12-05 09:23:10.091 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:23:10 compute-1 nova_compute[189066]: 2025-12-05 09:23:10.092 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2b94d7d0-b000-460f-8883-8953d60115d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:23:10 compute-1 podman[221275]: 2025-12-05 09:23:10.63079979 +0000 UTC m=+0.066833719 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:23:12 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:12.632 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}cb0d1a4dec94bbcc7ec9c9c41a1990e8e45b62cf72d7082e21fc82bbe12b0a53" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 05 09:23:13 compute-1 nova_compute[189066]: 2025-12-05 09:23:13.225 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:13 compute-1 nova_compute[189066]: 2025-12-05 09:23:13.428 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:14 compute-1 ovn_controller[95809]: 2025-12-05T09:23:14Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:6e:e7 10.100.0.10
Dec 05 09:23:14 compute-1 ovn_controller[95809]: 2025-12-05T09:23:14Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:6e:e7 10.100.0.10
Dec 05 09:23:14 compute-1 ovn_controller[95809]: 2025-12-05T09:23:14Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:1d:2e 10.100.0.8
Dec 05 09:23:14 compute-1 ovn_controller[95809]: 2025-12-05T09:23:14Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:1d:2e 10.100.0.8
Dec 05 09:23:14 compute-1 sshd-session[221344]: Received disconnect from 185.118.15.236 port 35916:11: Bye Bye [preauth]
Dec 05 09:23:14 compute-1 sshd-session[221344]: Disconnected from authenticating user root 185.118.15.236 port 35916 [preauth]
Dec 05 09:23:16 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:16.560 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 05 Dec 2025 09:23:12 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-6e7eefb2-3869-426a-a00a-dd456d3f9ee9 x-openstack-request-id: req-6e7eefb2-3869-426a-a00a-dd456d3f9ee9 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 05 09:23:16 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:16.561 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "043ee359-5816-4bcf-a592-8e6441e30b6f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/043ee359-5816-4bcf-a592-8e6441e30b6f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/043ee359-5816-4bcf-a592-8e6441e30b6f"}]}, {"id": "fbadeab4-f24f-4100-963a-d228b2a6f7c4", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/fbadeab4-f24f-4100-963a-d228b2a6f7c4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/fbadeab4-f24f-4100-963a-d228b2a6f7c4"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 05 09:23:16 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:16.561 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-6e7eefb2-3869-426a-a00a-dd456d3f9ee9 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 05 09:23:16 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:16.563 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/fbadeab4-f24f-4100-963a-d228b2a6f7c4 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}cb0d1a4dec94bbcc7ec9c9c41a1990e8e45b62cf72d7082e21fc82bbe12b0a53" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 05 09:23:17 compute-1 podman[221346]: 2025-12-05 09:23:17.665745024 +0000 UTC m=+0.083019755 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:23:18 compute-1 nova_compute[189066]: 2025-12-05 09:23:18.228 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:18 compute-1 nova_compute[189066]: 2025-12-05 09:23:18.432 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.053 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Fri, 05 Dec 2025 09:23:16 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f2ba05e7-a973-4b56-ac64-1c21dd5b3bec x-openstack-request-id: req-f2ba05e7-a973-4b56-ac64-1c21dd5b3bec _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.053 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "fbadeab4-f24f-4100-963a-d228b2a6f7c4", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/fbadeab4-f24f-4100-963a-d228b2a6f7c4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/fbadeab4-f24f-4100-963a-d228b2a6f7c4"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.053 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/fbadeab4-f24f-4100-963a-d228b2a6f7c4 used request id req-f2ba05e7-a973-4b56-ac64-1c21dd5b3bec request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.057 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e26ae3fdd48d4947978a480f70e14f84', 'user_id': '65751a90715341b2984ef84ebbaa1650', 'hostId': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.061 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'aa5b008731384d18ba83c8d69e76bcef', 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'hostId': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.061 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.062 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-804921789>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1827012883>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-804921789>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1827012883>]
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.062 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.084 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/cpu volume: 13770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.104 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/cpu volume: 13910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3447dbf3-76b0-4288-8021-566bc4cffbf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13770000000, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'timestamp': '2025-12-05T09:23:19.063064', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '096fcf50-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.141821963, 'message_signature': '87127a3e3e14cb1793c4276caaf1d1b1eecf69d988469666fd6e4abf697cbcb9'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13910000000, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'timestamp': '2025-12-05T09:23:19.063064', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '09728e16-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.161452624, 'message_signature': '538ebcfcae479dc1a4319a331425f2b249dc41ba83bc1b433d25d85d0ae602bc'}]}, 'timestamp': '2025-12-05 09:23:19.104757', '_unique_id': '05bffdc3df934afc91bd86d93a7ff82b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.112 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.119 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2b94d7d0-b000-460f-8883-8953d60115d0 / tapb2d6abb2-b3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.119 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.123 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 95f85266-e2bc-4615-b523-e7346ad3ab40 / tapfbbb4f47-a5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.123 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4a67e2b-15bb-496f-b6e6-72955b252eae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.115655', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '0974f282-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': '45f0b7e5c05be91fee22af6ca98aeed7015c8a31a64d42c8fd8aaf55ce5aeb72'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.115655', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '0975926e-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': 'e22445f785a2c331658a2610227e3d79d3364df9d548be8ce7052fe40d250e1c'}]}, 'timestamp': '2025-12-05 09:23:19.124495', '_unique_id': '65edb2aa1ec24d85ae4c72fe7be7d38a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.126 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.138 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.139 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.149 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.150 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c998575-7d4d-4cbb-9960-c26ad646db20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.127165', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0977d380-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.18497827, 'message_signature': '4c6bf168263deb23b38be1a121b237fe3abd305617627b1aae0f68b57beec056'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.127165', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0977e1c2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.18497827, 'message_signature': '5f23e8544f7193d1f03174fc37d797093082e79f3cc987277f70459f5b582bf5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.127165', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0979866c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.197080417, 'message_signature': '84bd617db50d6ad1cd94c671779f5120c693a249c0d8ccadd02bc024ff663d64'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.127165', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '09799152-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.197080417, 'message_signature': 'b72fe79e5f9333dd001e83ac8657985ad1cce650650182aca06f908030faaa0d'}]}, 'timestamp': '2025-12-05 09:23:19.150582', '_unique_id': 'efcc1de662e0428faf6323d8e027811a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.151 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.152 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6eb672a-75d6-44ab-a86d-5999d6431314', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.152851', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '0979f606-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': '880c282b796576a8d432f42947261b15cc80123b2f07b542be36d059391a366d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.152851', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '097a0092-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': '83ed7dcf00e15d18468cd0625c47c0b5fb642ac5be950728b6d511da7e3340c0'}]}, 'timestamp': '2025-12-05 09:23:19.153452', '_unique_id': '9fe9dcbd629b4b75bf8ad9927a684a91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.153 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.154 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.outgoing.bytes volume: 1396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.154 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.outgoing.bytes volume: 1396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77e5da07-2513-4eae-a63b-f680760ae7f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1396, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.154625', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '097a3a6c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': '49792c641a013a01710ef0e22ef492647fe07a49de5d67bdfec0dd23b8eeabdd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1396, 'user_id': 
'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.154625', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '097a42c8-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': '9c18dde7171406fcdc0e9830b5b39bd725882d4d23cda31101715d7ebbbb5ec7'}]}, 'timestamp': '2025-12-05 09:23:19.155075', '_unique_id': '37df45a12f884175b292fa14b02f4427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.155 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.156 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.156 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-804921789>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1827012883>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-804921789>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1827012883>]
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.178 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.179 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.204 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.205 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cd68f10-20ef-4d21-ba8d-524d69ea1922', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.156507', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '097df6fc-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '884e6480b9abceca2a2731d30fbaf432b587420c9b303f52384c45c1052d60aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 
'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.156507', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '097e0458-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': 'd10c5d2ef8ad8e0b6dcaba63f2d281b5fd0d55d14d5e000095940f07cf1ad993'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.156507', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0981f018-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': 'b4968269c64cbe11c7c862ba11841b209e2dab6acba109d6651c68856509f903'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.156507', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0981fe28-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': '87b4504af5dd060dcb25b099ff52c0f406173c6176139626e9265136feb6341e'}]}, 'timestamp': '2025-12-05 09:23:19.205768', '_unique_id': '8bf26468dd424fbf875365414dbee82b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.207 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.208 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.208 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65ce660f-1c58-490f-92ec-d4f38cf6a1ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.208440', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '0982729a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': 'f0049fa168457480e416b383bcd2fe82ba0ebadce11717b04c9cfef77d3734a1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 
'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.208440', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '09827c9a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': '745c638db4a05cfa158f6350052464ceaab1e79922d3b48ffc926c100a954559'}]}, 'timestamp': '2025-12-05 09:23:19.208979', '_unique_id': '17ab8629e5414a11b388a6c4749f70c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.209 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.210 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.210 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-804921789>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1827012883>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-804921789>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1827012883>]
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.210 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.211 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.211 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.211 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71155e9c-fec8-4a38-8d9b-a37946fa4721', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.210758', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0982cc2c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '6354cacd617e6103388e13d57c553cd8958358888913e87295da66f0411df904'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.210758', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0982d44c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '033f065bc619d72ba269f1b9c4723123bff11726df348bfb796423e754b4e23e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.210758', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0982dfdc-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': '87feabfcab2f76b9547ba4d8ca2396db2ce69bc08c259f410b7cc479a693de17'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.210758', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0982e9be-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': 'f5c75e175a9ff61852deb6373b4087884a1d70cc6e6434244935846a8ffde941'}]}, 'timestamp': '2025-12-05 09:23:19.211780', '_unique_id': '7a8ac9b680bb4917aa023a6e81646f10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.212 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.213 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.213 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.read.latency volume: 282882034 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.213 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.read.latency volume: 28317615 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.213 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.read.latency volume: 326937671 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.213 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.read.latency volume: 33072683 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9144dc4a-a44f-419b-adc1-1bf1fec777d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 282882034, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.213100', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '098326a4-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '041ab2cc90bbe41c274879d499bdbaa599f4a3f47a6649595494b79b550a8937'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28317615, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.213100', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '098334d2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': 'e250f4e955f9f530766310bd9bedf500cfbe3f2b3764c646b1dc4ab0584ae4ef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 326937671, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.213100', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '09833ef0-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': 'dc495fefabfc9c252f2823ed43e079b60894f2d59fd204fac4af5ed11b7326fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33072683, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.213100', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '098346b6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': '5b398f6a72a832a6f2b7e74ff72f9a06942e21ed05fca62e54b2fbf29f20414d'}]}, 'timestamp': '2025-12-05 09:23:19.214140', '_unique_id': '7e350cb8ada94ea9964555f61afc133b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.214 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.215 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.215 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.write.latency volume: 3207822517 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.215 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.216 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.write.latency volume: 3285047717 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.216 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35bcd0f1-0b9d-4cd5-9ce7-be3c76923ac9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3207822517, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.215551', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0983866c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '83fffefeb91d8bed448a817f444689351e7e7e76ad235c78068a79175e957bee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': 
None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.215551', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '09838fa4-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '27b48366ccc683d5e46e758f76b1f3d38cccb1ff014567775c9a2d281d214597'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3285047717, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.215551', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0983976a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': '2a4669a8416560783e30bfbefa9373a8d4ced46042c8d29cfbbd8a7db2af868e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.215551', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0983a016-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': 'efea0b4554adbcb5d882c6ca1cef09b241ee4a8b690d3b0a8df96579aa4bd13c'}]}, 'timestamp': '2025-12-05 09:23:19.216520', '_unique_id': 'f3fa1e52b06d4f499a79605e8c69fa8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.217 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fccf7eda-bae1-4178-afd0-888bfd891f3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.217839', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '0983e058-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': '1323fe103c1aec027c2008a2d5665f5451a8ba3205484c050f8e00de3992948a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.217839', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '0983e8dc-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': '55a3278121459ed102dd8168a474b2d90b416de25aabfe9960051db4dd0ac5df'}]}, 'timestamp': '2025-12-05 09:23:19.218299', '_unique_id': '8012798aa4434da09488377d95eff6e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.218 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.219 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.read.bytes volume: 30493184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.219 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.220 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.read.bytes volume: 30513664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.220 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6976a7b4-081d-4312-aa34-ff70397de9d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30493184, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.219603', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '098426da-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '88792fc9c287dcadc3253252914d80a2c7247776708e8d6066a45f2833f540e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 
'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.219603', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '09842f36-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': 'b93fe3fb71335cd98f36ba02f5fb8db138edc5e5d9adce70b5818d77c68d0f35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30513664, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.219603', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '098436a2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': '72c435aa3d93c33135607f390b66f4fa9fcb06bfc98bb58c75a97a3e8168e390'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.219603', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '09843fc6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': 'd80495019186539c22e233b6c6232a1d5db2d8d7487ce9b846aa1d16aa8f1e6a'}]}, 'timestamp': '2025-12-05 09:23:19.220591', '_unique_id': '179acdbf5985481ead52c1c2cc1a9364'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.221 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.222 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.222 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.222 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c3f28d0-757f-4025-a083-ed6f68abe13c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.221921', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '09847f72-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.18497827, 'message_signature': '987faaab6913dcda705a557b9a16776525d4fa813c92b35f6baa1841826936cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 
'2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.221921', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '09848756-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.18497827, 'message_signature': '23b082d4293764e635fe4d749f8b47b3e904ad5d6f0af2edcf8ece755a79816f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.221921', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '09848ed6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.197080417, 'message_signature': 'ed3ddc9faf1715aec35d4592c9ca4817bdc520845478bc1f3b70e9e0677444aa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.221921', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '09849976-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.197080417, 'message_signature': '3d385599f136cba1024034967fb0bf6d14b93be2d4fd7a4d429864335702c149'}]}, 'timestamp': '2025-12-05 09:23:19.222847', '_unique_id': '851bb1b73cb744afb3aceba97de55a6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.223 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.224 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.224 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b58335a-cb8a-4c27-a485-87f08273925d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.224081', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '0984d4ea-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': 'ed92e65966269e0b5ec3f3b11cf80f9e4c9cbbe500464c434d364951362f631d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.224081', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '0984de4a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': '37aa82ccf9fbb873770d5cca4bc084dda1deccbac27729bd7008a461d4e389a0'}]}, 'timestamp': '2025-12-05 09:23:19.224605', '_unique_id': '805fce135be3445da1b29d6af81e3008'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.225 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.read.requests volume: 1096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.226 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.226 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.read.requests volume: 1102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.226 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '301ec0f3-1710-4ead-9b3b-45a504e008ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1096, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.225895', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '09851bb2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '0e4b62340c16762faf53077cbd1d02df58487481488e4d5237d2592a9586c45d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 
'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.225895', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '098523d2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.214243348, 'message_signature': '5000e2d63711f13928f009b8d5d3b4e633bafc02ede56a9732b0598a7353fa30'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1102, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.225895', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '09852b5c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': 'f87bee6ecdd4f45a381bf4d55e43b6c485f427b566eda541874781b7b9db79c0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.225895', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '09853584-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.237272562, 'message_signature': '2a910a241e569caa9f829795478ac2b88d175e4b69bdadad6ad7a0adfb9da5e1'}]}, 'timestamp': '2025-12-05 09:23:19.226818', '_unique_id': 'af571cce17ed467d81cd14c182ff6c2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.227 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.228 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.228 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c79603a2-4b38-44c6-b08b-3843c6907930', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.228122', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '09857152-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': 'fd2bd839a8abc3197e8dd53d538f62a2ab2104e1d425a79480af4a5dff9d9fa6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 
'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.228122', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '098579cc-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': 'cd1552f8935c882ea87549d62fddc0f502d4ae8081c3618915b04e0fe1c800eb'}]}, 'timestamp': '2025-12-05 09:23:19.228588', '_unique_id': '4fa4c8cce1af45a185ccb6a556c9e019'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.229 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77d76313-4db9-4f17-ac96-297fc0687d8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.229929', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '0985b7de-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': '2cfed79be31433dd77b8a2a05180b0ea937a7e07b5e1f793dc43d0d9c8fad2d8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.229929', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '0985c2ba-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': '984dac8da643ece8c3280a6aabfcf351cee83b70b90ff1c0ff392b883391c7e1'}]}, 'timestamp': '2025-12-05 09:23:19.230438', '_unique_id': 'f6f02269760848e5a8802ca1d9a9f445'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.230 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.231 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.231 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-804921789>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1827012883>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-804921789>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1827012883>]
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.232 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.232 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/memory.usage volume: 40.48046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.232 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/memory.usage volume: 40.45703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9edd18d-8cee-499b-a942-a63459b6bcb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.48046875, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'timestamp': '2025-12-05T09:23:19.232064', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '09860b94-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.141821963, 'message_signature': '3d204c06dabd6522b5ffdf95d5f9ea3712c6c7dbd2c64857aca9bb7eda75ed25'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.45703125, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 
'95f85266-e2bc-4615-b523-e7346ad3ab40', 'timestamp': '2025-12-05T09:23:19.232064', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '09861530-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.161452624, 'message_signature': '8c3ed197c414e9a72d199513039f3041ab5295d4aef28a627c7d4116ce146c32'}]}, 'timestamp': '2025-12-05 09:23:19.232555', '_unique_id': '85a82fddf3644b8ca6dbeede5d22d6bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.233 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67454b05-fd9f-416b-9f60-ab481ffe87ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.233793', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '09865126-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': '41b4dcc458d9c1a54fe3eff6ddf8f40888d2384393b360ffda6614b060c72ce0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.233793', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '09865996-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': '38e30e9665245fa15add05ce203b95ae75140be5ffd1e85d8281ac9886c4b8cc'}]}, 'timestamp': '2025-12-05 09:23:19.234312', '_unique_id': 'a5e20f6a0a1a4f2bac911b899b1a03af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.234 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.235 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.235 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2074106e-035a-4b9e-9a38-93760b60ea77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-00000003-2b94d7d0-b000-460f-8883-8953d60115d0-tapb2d6abb2-b3', 'timestamp': '2025-12-05T09:23:19.235666', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'tapb2d6abb2-b3', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:1d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb2d6abb2-b3'}, 'message_id': '09869802-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.173325075, 'message_signature': '7565c2b7f558b5e77edea04c03726fc05386736ae142e0f6c97d05909886e78a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-00000004-95f85266-e2bc-4615-b523-e7346ad3ab40-tapfbbb4f47-a5', 'timestamp': '2025-12-05T09:23:19.235666', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'tapfbbb4f47-a5', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d6:6e:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb4f47-a5'}, 'message_id': '0986a2b6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.177968959, 'message_signature': 'f518775bd4781049c287464b40f4206b21300213c54eacf3984bbb13b6b9c4cf'}]}, 'timestamp': '2025-12-05 09:23:19.236169', '_unique_id': '7c513d90315b436d88c4349b8add6fa5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.236 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.237 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.237 12 DEBUG ceilometer.compute.pollsters [-] 2b94d7d0-b000-460f-8883-8953d60115d0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.238 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.238 12 DEBUG ceilometer.compute.pollsters [-] 95f85266-e2bc-4615-b523-e7346ad3ab40/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bbb5c2a-9197-4f55-a7c6-6e253161e32d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-vda', 'timestamp': '2025-12-05T09:23:19.237679', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0986e726-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.18497827, 'message_signature': '7eecbd7a770fabce6b3df665d4c8bd0930f5b2901cd34fb0201f49cfb7a9a4b9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 
'resource_id': '2b94d7d0-b000-460f-8883-8953d60115d0-sda', 'timestamp': '2025-12-05T09:23:19.237679', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'name': 'instance-00000003', 'instance_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0986f16c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.18497827, 'message_signature': 'c50f0bf3d75d4ec8782cb43fc6aa4e4690f1ec419f6e70e84fdb31ac15f13073'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-vda', 'timestamp': '2025-12-05T09:23:19.237679', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0986fa68-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.197080417, 'message_signature': 'd2ff872352fe1167fcb13f728c4f7568b735559e9af0631e7d09f8482483e6b5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '95f85266-e2bc-4615-b523-e7346ad3ab40-sda', 'timestamp': '2025-12-05T09:23:19.237679', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1827012883', 'name': 'instance-00000004', 'instance_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '09870238-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3872.197080417, 'message_signature': 'b6ebb9156773d0689e999448c709dfb079230d6dc7d846cf5329565af06e368d'}]}, 'timestamp': '2025-12-05 09:23:19.238658', '_unique_id': '7b15c7e341d0417abb2a129e2661140d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:23:19 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:23:19.239 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:23:19 compute-1 nova_compute[189066]: 2025-12-05 09:23:19.586 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating instance_info_cache with network_info: [{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.533 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.534 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.534 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.535 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.535 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.535 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.535 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.898 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.898 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.898 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:21 compute-1 nova_compute[189066]: 2025-12-05 09:23:21.899 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:23:23 compute-1 nova_compute[189066]: 2025-12-05 09:23:23.231 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:23 compute-1 nova_compute[189066]: 2025-12-05 09:23:23.433 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:24 compute-1 sshd-session[221302]: Received disconnect from 101.47.162.91 port 54696:11: Bye Bye [preauth]
Dec 05 09:23:24 compute-1 sshd-session[221302]: Disconnected from 101.47.162.91 port 54696 [preauth]
Dec 05 09:23:26 compute-1 podman[221370]: 2025-12-05 09:23:26.641887482 +0000 UTC m=+0.077827168 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:23:26 compute-1 nova_compute[189066]: 2025-12-05 09:23:26.692 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:23:26 compute-1 nova_compute[189066]: 2025-12-05 09:23:26.773 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:23:26 compute-1 nova_compute[189066]: 2025-12-05 09:23:26.774 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:23:26 compute-1 nova_compute[189066]: 2025-12-05 09:23:26.839 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:23:26 compute-1 nova_compute[189066]: 2025-12-05 09:23:26.848 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:23:26 compute-1 nova_compute[189066]: 2025-12-05 09:23:26.913 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:23:26 compute-1 nova_compute[189066]: 2025-12-05 09:23:26.915 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:23:26 compute-1 nova_compute[189066]: 2025-12-05 09:23:26.977 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.188 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.190 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5453MB free_disk=73.28036117553711GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.190 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.191 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.318 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.318 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 95f85266-e2bc-4615-b523-e7346ad3ab40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.318 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.318 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.480 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.508 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.803 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:23:27 compute-1 nova_compute[189066]: 2025-12-05 09:23:27.803 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:28 compute-1 nova_compute[189066]: 2025-12-05 09:23:28.005 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:28 compute-1 NetworkManager[55704]: <info>  [1764926608.0070] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/29)
Dec 05 09:23:28 compute-1 NetworkManager[55704]: <info>  [1764926608.0084] device (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 09:23:28 compute-1 NetworkManager[55704]: <info>  [1764926608.0100] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/30)
Dec 05 09:23:28 compute-1 NetworkManager[55704]: <info>  [1764926608.0104] device (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 09:23:28 compute-1 NetworkManager[55704]: <info>  [1764926608.0113] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Dec 05 09:23:28 compute-1 NetworkManager[55704]: <info>  [1764926608.0119] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Dec 05 09:23:28 compute-1 NetworkManager[55704]: <info>  [1764926608.0123] device (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 05 09:23:28 compute-1 NetworkManager[55704]: <info>  [1764926608.0128] device (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 05 09:23:28 compute-1 nova_compute[189066]: 2025-12-05 09:23:28.233 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:28 compute-1 nova_compute[189066]: 2025-12-05 09:23:28.273 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:28 compute-1 ovn_controller[95809]: 2025-12-05T09:23:28Z|00037|binding|INFO|Releasing lport 0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa from this chassis (sb_readonly=0)
Dec 05 09:23:28 compute-1 ovn_controller[95809]: 2025-12-05T09:23:28Z|00038|binding|INFO|Releasing lport b3346b7b-37e6-4cad-b494-c202eaae0edf from this chassis (sb_readonly=0)
Dec 05 09:23:28 compute-1 nova_compute[189066]: 2025-12-05 09:23:28.322 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:28 compute-1 nova_compute[189066]: 2025-12-05 09:23:28.467 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:28 compute-1 podman[221404]: 2025-12-05 09:23:28.691885393 +0000 UTC m=+0.110681983 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 05 09:23:29 compute-1 ovn_controller[95809]: 2025-12-05T09:23:29Z|00039|memory|INFO|peak resident set size grew 60% in last 1196.8 seconds, from 16128 kB to 25856 kB
Dec 05 09:23:29 compute-1 ovn_controller[95809]: 2025-12-05T09:23:29Z|00040|memory|INFO|idl-cells-OVN_Southbound:13519 idl-cells-Open_vSwitch:927 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:483 lflow-cache-entries-cache-matches:326 lflow-cache-size-KB:2039 local_datapath_usage-KB:4 ofctrl_desired_flow_usage-KB:810 ofctrl_installed_flow_usage-KB:593 ofctrl_sb_flow_ref_usage-KB:301
Dec 05 09:23:30 compute-1 nova_compute[189066]: 2025-12-05 09:23:30.100 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Check if temp file /var/lib/nova/instances/tmpnflagnbx exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Dec 05 09:23:30 compute-1 nova_compute[189066]: 2025-12-05 09:23:30.101 189070 DEBUG nova.compute.manager [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnflagnbx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='95f85266-e2bc-4615-b523-e7346ad3ab40',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Dec 05 09:23:31 compute-1 podman[221430]: 2025-12-05 09:23:31.829706328 +0000 UTC m=+0.059615581 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:23:33 compute-1 nova_compute[189066]: 2025-12-05 09:23:33.235 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:33 compute-1 nova_compute[189066]: 2025-12-05 09:23:33.470 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:34 compute-1 podman[221451]: 2025-12-05 09:23:34.628803853 +0000 UTC m=+0.069756790 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 05 09:23:35 compute-1 sshd-session[221449]: Received disconnect from 43.225.158.169 port 46926:11: Bye Bye [preauth]
Dec 05 09:23:35 compute-1 sshd-session[221449]: Disconnected from authenticating user root 43.225.158.169 port 46926 [preauth]
Dec 05 09:23:36 compute-1 nova_compute[189066]: 2025-12-05 09:23:36.399 189070 DEBUG nova.compute.manager [req-4f11eb20-1be9-4448-84e9-473a79a338ee req-66fc3e2e-100e-485c-91be-90fe07a5f419 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-changed-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:23:36 compute-1 nova_compute[189066]: 2025-12-05 09:23:36.399 189070 DEBUG nova.compute.manager [req-4f11eb20-1be9-4448-84e9-473a79a338ee req-66fc3e2e-100e-485c-91be-90fe07a5f419 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Refreshing instance network info cache due to event network-changed-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:23:36 compute-1 nova_compute[189066]: 2025-12-05 09:23:36.400 189070 DEBUG oslo_concurrency.lockutils [req-4f11eb20-1be9-4448-84e9-473a79a338ee req-66fc3e2e-100e-485c-91be-90fe07a5f419 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:23:36 compute-1 nova_compute[189066]: 2025-12-05 09:23:36.400 189070 DEBUG oslo_concurrency.lockutils [req-4f11eb20-1be9-4448-84e9-473a79a338ee req-66fc3e2e-100e-485c-91be-90fe07a5f419 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:23:36 compute-1 nova_compute[189066]: 2025-12-05 09:23:36.400 189070 DEBUG nova.network.neutron [req-4f11eb20-1be9-4448-84e9-473a79a338ee req-66fc3e2e-100e-485c-91be-90fe07a5f419 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Refreshing network info cache for port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:23:36 compute-1 nova_compute[189066]: 2025-12-05 09:23:36.936 189070 DEBUG oslo_concurrency.processutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:23:37 compute-1 nova_compute[189066]: 2025-12-05 09:23:37.006 189070 DEBUG oslo_concurrency.processutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:23:37 compute-1 nova_compute[189066]: 2025-12-05 09:23:37.007 189070 DEBUG oslo_concurrency.processutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:23:37 compute-1 nova_compute[189066]: 2025-12-05 09:23:37.069 189070 DEBUG oslo_concurrency.processutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:23:37 compute-1 nova_compute[189066]: 2025-12-05 09:23:37.071 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:23:37 compute-1 nova_compute[189066]: 2025-12-05 09:23:37.071 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:23:37 compute-1 nova_compute[189066]: 2025-12-05 09:23:37.121 189070 INFO nova.compute.rpcapi [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Dec 05 09:23:37 compute-1 nova_compute[189066]: 2025-12-05 09:23:37.122 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:23:38 compute-1 nova_compute[189066]: 2025-12-05 09:23:38.238 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:38 compute-1 nova_compute[189066]: 2025-12-05 09:23:38.512 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:38 compute-1 podman[221478]: 2025-12-05 09:23:38.621931485 +0000 UTC m=+0.059241972 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter)
Dec 05 09:23:40 compute-1 nova_compute[189066]: 2025-12-05 09:23:40.451 189070 INFO nova.compute.manager [None req-4b5666bf-c71e-416e-a1fe-6c66d07e87b8 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Get console output
Dec 05 09:23:40 compute-1 nova_compute[189066]: 2025-12-05 09:23:40.565 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:23:41 compute-1 ovn_controller[95809]: 2025-12-05T09:23:41Z|00041|binding|INFO|Releasing lport 0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa from this chassis (sb_readonly=0)
Dec 05 09:23:41 compute-1 ovn_controller[95809]: 2025-12-05T09:23:41Z|00042|binding|INFO|Releasing lport b3346b7b-37e6-4cad-b494-c202eaae0edf from this chassis (sb_readonly=0)
Dec 05 09:23:41 compute-1 nova_compute[189066]: 2025-12-05 09:23:41.496 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:41 compute-1 podman[221499]: 2025-12-05 09:23:41.635205028 +0000 UTC m=+0.066870649 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:23:42 compute-1 nova_compute[189066]: 2025-12-05 09:23:42.250 189070 DEBUG nova.network.neutron [req-4f11eb20-1be9-4448-84e9-473a79a338ee req-66fc3e2e-100e-485c-91be-90fe07a5f419 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updated VIF entry in instance network info cache for port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:23:42 compute-1 nova_compute[189066]: 2025-12-05 09:23:42.251 189070 DEBUG nova.network.neutron [req-4f11eb20-1be9-4448-84e9-473a79a338ee req-66fc3e2e-100e-485c-91be-90fe07a5f419 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating instance_info_cache with network_info: [{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:23:43 compute-1 nova_compute[189066]: 2025-12-05 09:23:43.240 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:43 compute-1 nova_compute[189066]: 2025-12-05 09:23:43.514 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:43 compute-1 nova_compute[189066]: 2025-12-05 09:23:43.656 189070 DEBUG oslo_concurrency.lockutils [req-4f11eb20-1be9-4448-84e9-473a79a338ee req-66fc3e2e-100e-485c-91be-90fe07a5f419 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:23:44 compute-1 sshd-session[221523]: Accepted publickey for nova from 192.168.122.102 port 44972 ssh2: ECDSA SHA256:SmkhuBePRe5VD3eW9pHWZd8sXFprcvpDE1m9LAG/9Ps
Dec 05 09:23:44 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Dec 05 09:23:44 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec 05 09:23:44 compute-1 systemd-logind[807]: New session 28 of user nova.
Dec 05 09:23:44 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec 05 09:23:44 compute-1 systemd[1]: Starting User Manager for UID 42436...
Dec 05 09:23:44 compute-1 systemd[221527]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 09:23:44 compute-1 systemd[221527]: Queued start job for default target Main User Target.
Dec 05 09:23:44 compute-1 systemd[221527]: Created slice User Application Slice.
Dec 05 09:23:44 compute-1 systemd[221527]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 05 09:23:44 compute-1 systemd[221527]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 09:23:44 compute-1 systemd[221527]: Reached target Paths.
Dec 05 09:23:44 compute-1 systemd[221527]: Reached target Timers.
Dec 05 09:23:44 compute-1 systemd[221527]: Starting D-Bus User Message Bus Socket...
Dec 05 09:23:44 compute-1 systemd[221527]: Starting Create User's Volatile Files and Directories...
Dec 05 09:23:44 compute-1 systemd[221527]: Finished Create User's Volatile Files and Directories.
Dec 05 09:23:44 compute-1 systemd[221527]: Listening on D-Bus User Message Bus Socket.
Dec 05 09:23:44 compute-1 systemd[221527]: Reached target Sockets.
Dec 05 09:23:44 compute-1 systemd[221527]: Reached target Basic System.
Dec 05 09:23:44 compute-1 systemd[221527]: Reached target Main User Target.
Dec 05 09:23:44 compute-1 systemd[221527]: Startup finished in 135ms.
Dec 05 09:23:44 compute-1 systemd[1]: Started User Manager for UID 42436.
Dec 05 09:23:44 compute-1 systemd[1]: Started Session 28 of User nova.
Dec 05 09:23:44 compute-1 sshd-session[221523]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 09:23:44 compute-1 sshd-session[221542]: Received disconnect from 192.168.122.102 port 44972:11: disconnected by user
Dec 05 09:23:44 compute-1 sshd-session[221542]: Disconnected from user nova 192.168.122.102 port 44972
Dec 05 09:23:44 compute-1 sshd-session[221523]: pam_unix(sshd:session): session closed for user nova
Dec 05 09:23:44 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Dec 05 09:23:44 compute-1 systemd-logind[807]: Session 28 logged out. Waiting for processes to exit.
Dec 05 09:23:44 compute-1 systemd-logind[807]: Removed session 28.
Dec 05 09:23:46 compute-1 nova_compute[189066]: 2025-12-05 09:23:46.737 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:48 compute-1 nova_compute[189066]: 2025-12-05 09:23:48.243 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:48 compute-1 nova_compute[189066]: 2025-12-05 09:23:48.558 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:48 compute-1 podman[221547]: 2025-12-05 09:23:48.694355566 +0000 UTC m=+0.075704556 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:23:50 compute-1 nova_compute[189066]: 2025-12-05 09:23:50.136 189070 DEBUG nova.compute.manager [req-39032ab3-7751-44bc-8be8-883696b9bc1e req-4b8a8d03-bcb0-4655-8308-fc6656cdfd0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-vif-unplugged-fbbb4f47-a5b9-460c-ae65-29a664051272 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:23:50 compute-1 nova_compute[189066]: 2025-12-05 09:23:50.137 189070 DEBUG oslo_concurrency.lockutils [req-39032ab3-7751-44bc-8be8-883696b9bc1e req-4b8a8d03-bcb0-4655-8308-fc6656cdfd0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:23:50 compute-1 nova_compute[189066]: 2025-12-05 09:23:50.137 189070 DEBUG oslo_concurrency.lockutils [req-39032ab3-7751-44bc-8be8-883696b9bc1e req-4b8a8d03-bcb0-4655-8308-fc6656cdfd0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:50 compute-1 nova_compute[189066]: 2025-12-05 09:23:50.137 189070 DEBUG oslo_concurrency.lockutils [req-39032ab3-7751-44bc-8be8-883696b9bc1e req-4b8a8d03-bcb0-4655-8308-fc6656cdfd0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:50 compute-1 nova_compute[189066]: 2025-12-05 09:23:50.138 189070 DEBUG nova.compute.manager [req-39032ab3-7751-44bc-8be8-883696b9bc1e req-4b8a8d03-bcb0-4655-8308-fc6656cdfd0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] No waiting events found dispatching network-vif-unplugged-fbbb4f47-a5b9-460c-ae65-29a664051272 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:23:50 compute-1 nova_compute[189066]: 2025-12-05 09:23:50.138 189070 DEBUG nova.compute.manager [req-39032ab3-7751-44bc-8be8-883696b9bc1e req-4b8a8d03-bcb0-4655-8308-fc6656cdfd0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-vif-unplugged-fbbb4f47-a5b9-460c-ae65-29a664051272 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:23:52 compute-1 nova_compute[189066]: 2025-12-05 09:23:52.785 189070 INFO nova.compute.manager [None req-56a032b1-8cf1-488f-ba94-c20d6c0b10fb 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Get console output
Dec 05 09:23:52 compute-1 nova_compute[189066]: 2025-12-05 09:23:52.792 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:23:53 compute-1 nova_compute[189066]: 2025-12-05 09:23:53.246 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:53 compute-1 nova_compute[189066]: 2025-12-05 09:23:53.355 189070 INFO nova.compute.manager [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Took 16.28 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Dec 05 09:23:53 compute-1 nova_compute[189066]: 2025-12-05 09:23:53.356 189070 DEBUG nova.compute.manager [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:23:53 compute-1 nova_compute[189066]: 2025-12-05 09:23:53.562 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.241 189070 DEBUG nova.compute.manager [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnflagnbx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='95f85266-e2bc-4615-b523-e7346ad3ab40',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(60ad7bb0-a13d-4ca7-a419-a16d5433e46c),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.305 189070 DEBUG nova.objects.instance [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lazy-loading 'migration_context' on Instance uuid 95f85266-e2bc-4615-b523-e7346ad3ab40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.306 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.309 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.309 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.381 189070 DEBUG nova.virt.libvirt.vif [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:22:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1827012883',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1827012883',id=4,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:23:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='aa5b008731384d18ba83c8d69e76bcef',ramdisk_id='',reservation_id='r-0233w6ol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-63392288',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-63392288-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:23:00Z,user_data=None,user_id='a9b7ab1c9c854146af8af16f337a063d',uuid=95f85266-e2bc-4615-b523-e7346ad3ab40,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.382 189070 DEBUG nova.network.os_vif_util [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Converting VIF {"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.384 189070 DEBUG nova.network.os_vif_util [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:6e:e7,bridge_name='br-int',has_traffic_filtering=True,id=fbbb4f47-a5b9-460c-ae65-29a664051272,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb4f47-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.386 189070 DEBUG nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updating guest XML with vif config: <interface type="ethernet">
Dec 05 09:23:54 compute-1 nova_compute[189066]:   <mac address="fa:16:3e:d6:6e:e7"/>
Dec 05 09:23:54 compute-1 nova_compute[189066]:   <model type="virtio"/>
Dec 05 09:23:54 compute-1 nova_compute[189066]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:23:54 compute-1 nova_compute[189066]:   <mtu size="1442"/>
Dec 05 09:23:54 compute-1 nova_compute[189066]:   <target dev="tapfbbb4f47-a5"/>
Dec 05 09:23:54 compute-1 nova_compute[189066]: </interface>
Dec 05 09:23:54 compute-1 nova_compute[189066]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.387 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Dec 05 09:23:54 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Dec 05 09:23:54 compute-1 systemd[221527]: Activating special unit Exit the Session...
Dec 05 09:23:54 compute-1 systemd[221527]: Stopped target Main User Target.
Dec 05 09:23:54 compute-1 systemd[221527]: Stopped target Basic System.
Dec 05 09:23:54 compute-1 systemd[221527]: Stopped target Paths.
Dec 05 09:23:54 compute-1 systemd[221527]: Stopped target Sockets.
Dec 05 09:23:54 compute-1 systemd[221527]: Stopped target Timers.
Dec 05 09:23:54 compute-1 systemd[221527]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 05 09:23:54 compute-1 systemd[221527]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 05 09:23:54 compute-1 systemd[221527]: Closed D-Bus User Message Bus Socket.
Dec 05 09:23:54 compute-1 systemd[221527]: Stopped Create User's Volatile Files and Directories.
Dec 05 09:23:54 compute-1 systemd[221527]: Removed slice User Application Slice.
Dec 05 09:23:54 compute-1 systemd[221527]: Reached target Shutdown.
Dec 05 09:23:54 compute-1 systemd[221527]: Finished Exit the Session.
Dec 05 09:23:54 compute-1 systemd[221527]: Reached target Exit the Session.
Dec 05 09:23:54 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Dec 05 09:23:54 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Dec 05 09:23:54 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec 05 09:23:54 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec 05 09:23:54 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec 05 09:23:54 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec 05 09:23:54 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.813 189070 DEBUG nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.814 189070 INFO nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.831 189070 DEBUG nova.compute.manager [req-ce71b434-3e36-47c6-b4f4-dcd813490b6c req-820dfd5c-8dee-425b-9cf8-ce462faf044a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.831 189070 DEBUG oslo_concurrency.lockutils [req-ce71b434-3e36-47c6-b4f4-dcd813490b6c req-820dfd5c-8dee-425b-9cf8-ce462faf044a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.831 189070 DEBUG oslo_concurrency.lockutils [req-ce71b434-3e36-47c6-b4f4-dcd813490b6c req-820dfd5c-8dee-425b-9cf8-ce462faf044a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.831 189070 DEBUG oslo_concurrency.lockutils [req-ce71b434-3e36-47c6-b4f4-dcd813490b6c req-820dfd5c-8dee-425b-9cf8-ce462faf044a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.832 189070 DEBUG nova.compute.manager [req-ce71b434-3e36-47c6-b4f4-dcd813490b6c req-820dfd5c-8dee-425b-9cf8-ce462faf044a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] No waiting events found dispatching network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:23:54 compute-1 nova_compute[189066]: 2025-12-05 09:23:54.832 189070 WARNING nova.compute.manager [req-ce71b434-3e36-47c6-b4f4-dcd813490b6c req-820dfd5c-8dee-425b-9cf8-ce462faf044a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received unexpected event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 for instance with vm_state active and task_state migrating.
Dec 05 09:23:55 compute-1 nova_compute[189066]: 2025-12-05 09:23:55.446 189070 INFO nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 05 09:23:55 compute-1 nova_compute[189066]: 2025-12-05 09:23:55.950 189070 DEBUG nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 05 09:23:55 compute-1 nova_compute[189066]: 2025-12-05 09:23:55.952 189070 DEBUG nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec 05 09:23:56 compute-1 nova_compute[189066]: 2025-12-05 09:23:56.457 189070 DEBUG nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 05 09:23:56 compute-1 nova_compute[189066]: 2025-12-05 09:23:56.458 189070 DEBUG nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec 05 09:23:56 compute-1 nova_compute[189066]: 2025-12-05 09:23:56.815 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926636.8144307, 95f85266-e2bc-4615-b523-e7346ad3ab40 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:23:56 compute-1 nova_compute[189066]: 2025-12-05 09:23:56.815 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] VM Paused (Lifecycle Event)
Dec 05 09:23:56 compute-1 nova_compute[189066]: 2025-12-05 09:23:56.962 189070 DEBUG nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 05 09:23:56 compute-1 nova_compute[189066]: 2025-12-05 09:23:56.963 189070 DEBUG nova.virt.libvirt.migration [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec 05 09:23:57 compute-1 kernel: tapfbbb4f47-a5 (unregistering): left promiscuous mode
Dec 05 09:23:57 compute-1 NetworkManager[55704]: <info>  [1764926637.0036] device (tapfbbb4f47-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.011 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:57 compute-1 ovn_controller[95809]: 2025-12-05T09:23:57Z|00043|binding|INFO|Releasing lport fbbb4f47-a5b9-460c-ae65-29a664051272 from this chassis (sb_readonly=0)
Dec 05 09:23:57 compute-1 ovn_controller[95809]: 2025-12-05T09:23:57Z|00044|binding|INFO|Setting lport fbbb4f47-a5b9-460c-ae65-29a664051272 down in Southbound
Dec 05 09:23:57 compute-1 ovn_controller[95809]: 2025-12-05T09:23:57Z|00045|binding|INFO|Removing iface tapfbbb4f47-a5 ovn-installed in OVS
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.014 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.025 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:57 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec 05 09:23:57 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000004.scope: Consumed 17.838s CPU time.
Dec 05 09:23:57 compute-1 systemd-machined[154815]: Machine qemu-1-instance-00000004 terminated.
Dec 05 09:23:57 compute-1 podman[221589]: 2025-12-05 09:23:57.119563658 +0000 UTC m=+0.080970275 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.217 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.222 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.266 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.267 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.267 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.467 189070 DEBUG nova.virt.libvirt.guest [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '95f85266-e2bc-4615-b523-e7346ad3ab40' (instance-00000004) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.470 189070 INFO nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Migration operation has completed
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.471 189070 INFO nova.compute.manager [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] _post_live_migration() is started..
Dec 05 09:23:57 compute-1 ovn_controller[95809]: 2025-12-05T09:23:57Z|00046|binding|INFO|Releasing lport 0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa from this chassis (sb_readonly=0)
Dec 05 09:23:57 compute-1 ovn_controller[95809]: 2025-12-05T09:23:57Z|00047|binding|INFO|Releasing lport b3346b7b-37e6-4cad-b494-c202eaae0edf from this chassis (sb_readonly=0)
Dec 05 09:23:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:57.738 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:6e:e7 10.100.0.10'], port_security=['fa:16:3e:d6:6e:e7 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '27a42d69-fccd-4cb4-8b07-f904963c8b4f'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '95f85266-e2bc-4615-b523-e7346ad3ab40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ab115429-13ca-4258-9a08-fc76d0ca17b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=36325ae0-997b-4e15-a889-e33151da06b1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=fbbb4f47-a5b9-460c-ae65-29a664051272) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:23:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:57.741 105272 INFO neutron.agent.ovn.metadata.agent [-] Port fbbb4f47-a5b9-460c-ae65-29a664051272 in datapath 0a97aec7-0780-4b5e-9498-e796fd7b42fd unbound from our chassis
Dec 05 09:23:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:57.745 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a97aec7-0780-4b5e-9498-e796fd7b42fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:23:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:57.749 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6be348ee-731a-42b5-8a0a-ea379f5fbcda]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:23:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:57.753 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd namespace which is not needed anymore
Dec 05 09:23:57 compute-1 nova_compute[189066]: 2025-12-05 09:23:57.764 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:23:57 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[221001]: [NOTICE]   (221005) : haproxy version is 2.8.14-c23fe91
Dec 05 09:23:57 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[221001]: [NOTICE]   (221005) : path to executable is /usr/sbin/haproxy
Dec 05 09:23:57 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[221001]: [WARNING]  (221005) : Exiting Master process...
Dec 05 09:23:57 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[221001]: [ALERT]    (221005) : Current worker (221007) exited with code 143 (Terminated)
Dec 05 09:23:57 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[221001]: [WARNING]  (221005) : All workers exited. Exiting... (0)
Dec 05 09:23:57 compute-1 systemd[1]: libpod-70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae.scope: Deactivated successfully.
Dec 05 09:23:57 compute-1 podman[221649]: 2025-12-05 09:23:57.931160364 +0000 UTC m=+0.052013825 container died 70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:23:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae-userdata-shm.mount: Deactivated successfully.
Dec 05 09:23:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-b0ab732b937e1a0aa90885ab57a8a882c3f199767b0ea9071c131766f2258997-merged.mount: Deactivated successfully.
Dec 05 09:23:57 compute-1 podman[221649]: 2025-12-05 09:23:57.968870818 +0000 UTC m=+0.089724269 container cleanup 70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 05 09:23:57 compute-1 systemd[1]: libpod-conmon-70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae.scope: Deactivated successfully.
Dec 05 09:23:58 compute-1 podman[221679]: 2025-12-05 09:23:58.049497634 +0000 UTC m=+0.047921915 container remove 70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.056 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[502a0916-f034-457f-aa92-edb13274ec16]: (4, ('Fri Dec  5 09:23:57 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd (70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae)\n70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae\nFri Dec  5 09:23:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd (70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae)\n70231d9faa901311c678624953a87062e85e9c9355234ddd131fe44ab3e4a2ae\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.058 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf9d6d4-7926-4780-afb5-8c7a689476fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.059 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a97aec7-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:23:58 compute-1 kernel: tap0a97aec7-00: left promiscuous mode
Dec 05 09:23:58 compute-1 nova_compute[189066]: 2025-12-05 09:23:58.096 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:58 compute-1 nova_compute[189066]: 2025-12-05 09:23:58.115 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.123 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ce7720-fa47-4b74-a660-af2f2d23b307]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.138 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fbabd3e0-bea9-4506-ae11-3c4202059e3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.140 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[27a76d97-17a1-4b13-bb7b-2ac4d06b7b73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.163 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b10ddc-e451-4f34-91e4-fa808320d090]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383276, 'reachable_time': 21807, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221698, 'error': None, 'target': 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:23:58 compute-1 systemd[1]: run-netns-ovnmeta\x2d0a97aec7\x2d0780\x2d4b5e\x2d9498\x2de796fd7b42fd.mount: Deactivated successfully.
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.182 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:23:58 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:23:58.184 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[251f06bf-a9ed-4cac-8245-8f649b2d4798]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:23:58 compute-1 nova_compute[189066]: 2025-12-05 09:23:58.250 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:58 compute-1 nova_compute[189066]: 2025-12-05 09:23:58.565 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:59 compute-1 podman[221702]: 2025-12-05 09:23:59.685971193 +0000 UTC m=+0.120630638 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.686 189070 DEBUG nova.compute.manager [req-7e32a7e4-e326-4b6d-900b-fb9e10c51aba req-9859c8f8-c488-41c1-9675-0472c7e9d2f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-changed-fbbb4f47-a5b9-460c-ae65-29a664051272 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.686 189070 DEBUG nova.compute.manager [req-7e32a7e4-e326-4b6d-900b-fb9e10c51aba req-9859c8f8-c488-41c1-9675-0472c7e9d2f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Refreshing instance network info cache due to event network-changed-fbbb4f47-a5b9-460c-ae65-29a664051272. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.687 189070 DEBUG oslo_concurrency.lockutils [req-7e32a7e4-e326-4b6d-900b-fb9e10c51aba req-9859c8f8-c488-41c1-9675-0472c7e9d2f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.687 189070 DEBUG oslo_concurrency.lockutils [req-7e32a7e4-e326-4b6d-900b-fb9e10c51aba req-9859c8f8-c488-41c1-9675-0472c7e9d2f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.687 189070 DEBUG nova.network.neutron [req-7e32a7e4-e326-4b6d-900b-fb9e10c51aba req-9859c8f8-c488-41c1-9675-0472c7e9d2f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Refreshing network info cache for port fbbb4f47-a5b9-460c-ae65-29a664051272 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.883 189070 DEBUG nova.network.neutron [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Activated binding for port fbbb4f47-a5b9-460c-ae65-29a664051272 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.883 189070 DEBUG nova.compute.manager [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.884 189070 DEBUG nova.virt.libvirt.vif [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:22:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1827012883',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1827012883',id=4,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:23:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='aa5b008731384d18ba83c8d69e76bcef',ramdisk_id='',reservation_id='r-0233w6ol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-63392288',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-63392288-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:23:21Z,user_data=None,user_id='a9b7ab1c9c854146af8af16f337a063d',uuid=95f85266-e2bc-4615-b523-e7346ad3ab40,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.885 189070 DEBUG nova.network.os_vif_util [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Converting VIF {"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.885 189070 DEBUG nova.network.os_vif_util [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:6e:e7,bridge_name='br-int',has_traffic_filtering=True,id=fbbb4f47-a5b9-460c-ae65-29a664051272,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb4f47-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.886 189070 DEBUG os_vif [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:6e:e7,bridge_name='br-int',has_traffic_filtering=True,id=fbbb4f47-a5b9-460c-ae65-29a664051272,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb4f47-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.890 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.891 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbbb4f47-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.893 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.896 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.907 189070 INFO os_vif [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:6e:e7,bridge_name='br-int',has_traffic_filtering=True,id=fbbb4f47-a5b9-460c-ae65-29a664051272,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb4f47-a5')
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.907 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.908 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.908 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.908 189070 DEBUG nova.compute.manager [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.909 189070 INFO nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Deleting instance files /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40_del
Dec 05 09:23:59 compute-1 nova_compute[189066]: 2025-12-05 09:23:59.910 189070 INFO nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Deletion of /var/lib/nova/instances/95f85266-e2bc-4615-b523-e7346ad3ab40_del complete
Dec 05 09:24:00 compute-1 sshd-session[221700]: Received disconnect from 122.168.194.41 port 53242:11: Bye Bye [preauth]
Dec 05 09:24:00 compute-1 sshd-session[221700]: Disconnected from authenticating user root 122.168.194.41 port 53242 [preauth]
Dec 05 09:24:02 compute-1 podman[221728]: 2025-12-05 09:24:02.624159626 +0000 UTC m=+0.060067253 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 09:24:03 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:03.340 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:24:03 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:03.342 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:24:03 compute-1 nova_compute[189066]: 2025-12-05 09:24:03.341 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:03 compute-1 nova_compute[189066]: 2025-12-05 09:24:03.567 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:04 compute-1 nova_compute[189066]: 2025-12-05 09:24:04.895 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:05 compute-1 podman[221748]: 2025-12-05 09:24:05.629995128 +0000 UTC m=+0.069553695 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Dec 05 09:24:08 compute-1 nova_compute[189066]: 2025-12-05 09:24:08.568 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:08.870 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:08.870 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:08.871 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:09 compute-1 nova_compute[189066]: 2025-12-05 09:24:09.292 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:09 compute-1 nova_compute[189066]: 2025-12-05 09:24:09.293 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:09 compute-1 podman[221770]: 2025-12-05 09:24:09.667835217 +0000 UTC m=+0.099831688 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, release=1755695350)
Dec 05 09:24:09 compute-1 nova_compute[189066]: 2025-12-05 09:24:09.900 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.236 189070 DEBUG nova.network.neutron [req-7e32a7e4-e326-4b6d-900b-fb9e10c51aba req-9859c8f8-c488-41c1-9675-0472c7e9d2f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updated VIF entry in instance network info cache for port fbbb4f47-a5b9-460c-ae65-29a664051272. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.236 189070 DEBUG nova.network.neutron [req-7e32a7e4-e326-4b6d-900b-fb9e10c51aba req-9859c8f8-c488-41c1-9675-0472c7e9d2f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updating instance_info_cache with network_info: [{"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:24:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:11.344 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.558 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.559 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.567 189070 DEBUG oslo_concurrency.lockutils [req-7e32a7e4-e326-4b6d-900b-fb9e10c51aba req-9859c8f8-c488-41c1-9675-0472c7e9d2f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.612 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.613 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.613 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.990 189070 DEBUG oslo_concurrency.lockutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Acquiring lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.990 189070 DEBUG oslo_concurrency.lockutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Acquired lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:24:11 compute-1 nova_compute[189066]: 2025-12-05 09:24:11.991 189070 DEBUG nova.network.neutron [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:24:12 compute-1 podman[221792]: 2025-12-05 09:24:12.625118899 +0000 UTC m=+0.065795184 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:24:12 compute-1 nova_compute[189066]: 2025-12-05 09:24:12.771 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764926637.2620661, 95f85266-e2bc-4615-b523-e7346ad3ab40 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:24:12 compute-1 nova_compute[189066]: 2025-12-05 09:24:12.771 189070 INFO nova.compute.manager [-] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] VM Stopped (Lifecycle Event)
Dec 05 09:24:12 compute-1 nova_compute[189066]: 2025-12-05 09:24:12.830 189070 DEBUG nova.compute.manager [None req-22f94d59-db1e-4985-85d1-ed52731741f0 - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:24:13 compute-1 nova_compute[189066]: 2025-12-05 09:24:13.603 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:14 compute-1 nova_compute[189066]: 2025-12-05 09:24:14.903 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:18 compute-1 nova_compute[189066]: 2025-12-05 09:24:18.638 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:18 compute-1 nova_compute[189066]: 2025-12-05 09:24:18.750 189070 DEBUG nova.compute.manager [req-32b9dc2e-9f29-40c8-880f-0c6fbb1fd27f req-c6b064ef-c54f-4d5a-b014-566e11922127 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-vif-unplugged-fbbb4f47-a5b9-460c-ae65-29a664051272 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:24:18 compute-1 nova_compute[189066]: 2025-12-05 09:24:18.751 189070 DEBUG oslo_concurrency.lockutils [req-32b9dc2e-9f29-40c8-880f-0c6fbb1fd27f req-c6b064ef-c54f-4d5a-b014-566e11922127 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:18 compute-1 nova_compute[189066]: 2025-12-05 09:24:18.751 189070 DEBUG oslo_concurrency.lockutils [req-32b9dc2e-9f29-40c8-880f-0c6fbb1fd27f req-c6b064ef-c54f-4d5a-b014-566e11922127 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:18 compute-1 nova_compute[189066]: 2025-12-05 09:24:18.751 189070 DEBUG oslo_concurrency.lockutils [req-32b9dc2e-9f29-40c8-880f-0c6fbb1fd27f req-c6b064ef-c54f-4d5a-b014-566e11922127 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:18 compute-1 nova_compute[189066]: 2025-12-05 09:24:18.752 189070 DEBUG nova.compute.manager [req-32b9dc2e-9f29-40c8-880f-0c6fbb1fd27f req-c6b064ef-c54f-4d5a-b014-566e11922127 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] No waiting events found dispatching network-vif-unplugged-fbbb4f47-a5b9-460c-ae65-29a664051272 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:24:18 compute-1 nova_compute[189066]: 2025-12-05 09:24:18.752 189070 DEBUG nova.compute.manager [req-32b9dc2e-9f29-40c8-880f-0c6fbb1fd27f req-c6b064ef-c54f-4d5a-b014-566e11922127 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-vif-unplugged-fbbb4f47-a5b9-460c-ae65-29a664051272 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:24:18 compute-1 nova_compute[189066]: 2025-12-05 09:24:18.891 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updating instance_info_cache with network_info: [{"id": "fbbb4f47-a5b9-460c-ae65-29a664051272", "address": "fa:16:3e:d6:6e:e7", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb4f47-a5", "ovs_interfaceid": "fbbb4f47-a5b9-460c-ae65-29a664051272", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.035 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-95f85266-e2bc-4615-b523-e7346ad3ab40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.036 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.037 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.037 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.037 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.038 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.038 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.038 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.038 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.039 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:24:19 compute-1 podman[221825]: 2025-12-05 09:24:19.619492358 +0000 UTC m=+0.057123990 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.659 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.660 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.660 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.660 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:24:19 compute-1 nova_compute[189066]: 2025-12-05 09:24:19.907 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.188 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.254 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.256 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.283 189070 DEBUG nova.compute.manager [req-ad1d29d2-d001-41cb-92f0-ab3c71681347 req-af795c34-4fed-40a9-916d-2a0a8daef936 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.284 189070 DEBUG oslo_concurrency.lockutils [req-ad1d29d2-d001-41cb-92f0-ab3c71681347 req-af795c34-4fed-40a9-916d-2a0a8daef936 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.284 189070 DEBUG oslo_concurrency.lockutils [req-ad1d29d2-d001-41cb-92f0-ab3c71681347 req-af795c34-4fed-40a9-916d-2a0a8daef936 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.285 189070 DEBUG oslo_concurrency.lockutils [req-ad1d29d2-d001-41cb-92f0-ab3c71681347 req-af795c34-4fed-40a9-916d-2a0a8daef936 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.285 189070 DEBUG nova.compute.manager [req-ad1d29d2-d001-41cb-92f0-ab3c71681347 req-af795c34-4fed-40a9-916d-2a0a8daef936 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] No waiting events found dispatching network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.285 189070 WARNING nova.compute.manager [req-ad1d29d2-d001-41cb-92f0-ab3c71681347 req-af795c34-4fed-40a9-916d-2a0a8daef936 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Received unexpected event network-vif-plugged-fbbb4f47-a5b9-460c-ae65-29a664051272 for instance with vm_state active and task_state migrating.
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.357 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.523 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.525 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5627MB free_disk=73.30521392822266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.526 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.526 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.801 189070 INFO nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating resource usage from migration 28d48d37-57dd-4441-8e4e-cf3cfac9be6c
Dec 05 09:24:20 compute-1 nova_compute[189066]: 2025-12-05 09:24:20.801 189070 INFO nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Updating resource usage from migration 60ad7bb0-a13d-4ca7-a419-a16d5433e46c
Dec 05 09:24:21 compute-1 nova_compute[189066]: 2025-12-05 09:24:21.330 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Migration 60ad7bb0-a13d-4ca7-a419-a16d5433e46c is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Dec 05 09:24:21 compute-1 nova_compute[189066]: 2025-12-05 09:24:21.330 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Migration 28d48d37-57dd-4441-8e4e-cf3cfac9be6c is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Dec 05 09:24:21 compute-1 nova_compute[189066]: 2025-12-05 09:24:21.330 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:24:21 compute-1 nova_compute[189066]: 2025-12-05 09:24:21.331 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:24:21 compute-1 nova_compute[189066]: 2025-12-05 09:24:21.432 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:24:21 compute-1 nova_compute[189066]: 2025-12-05 09:24:21.506 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:24:21 compute-1 nova_compute[189066]: 2025-12-05 09:24:21.650 189070 DEBUG nova.network.neutron [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating instance_info_cache with network_info: [{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:24:21 compute-1 nova_compute[189066]: 2025-12-05 09:24:21.990 189070 DEBUG oslo_concurrency.lockutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Releasing lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:24:22 compute-1 nova_compute[189066]: 2025-12-05 09:24:22.017 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:24:22 compute-1 nova_compute[189066]: 2025-12-05 09:24:22.018 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.492s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:22 compute-1 nova_compute[189066]: 2025-12-05 09:24:22.857 189070 DEBUG nova.virt.libvirt.driver [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Dec 05 09:24:22 compute-1 nova_compute[189066]: 2025-12-05 09:24:22.858 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Creating file /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/3544eba1972d4f7f824921644d616aff.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Dec 05 09:24:22 compute-1 nova_compute[189066]: 2025-12-05 09:24:22.858 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/3544eba1972d4f7f824921644d616aff.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:22 compute-1 nova_compute[189066]: 2025-12-05 09:24:22.903 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:22 compute-1 nova_compute[189066]: 2025-12-05 09:24:22.904 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:22 compute-1 nova_compute[189066]: 2025-12-05 09:24:22.971 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.251 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.251 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.259 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.260 189070 INFO nova.compute.claims [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.326 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/3544eba1972d4f7f824921644d616aff.tmp" returned: 1 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.326 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/3544eba1972d4f7f824921644d616aff.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.327 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Creating directory /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.327 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.537 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.541 189070 DEBUG nova.virt.libvirt.driver [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 09:24:23 compute-1 nova_compute[189066]: 2025-12-05 09:24:23.640 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:24 compute-1 nova_compute[189066]: 2025-12-05 09:24:24.806 189070 DEBUG nova.compute.provider_tree [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:24:24 compute-1 nova_compute[189066]: 2025-12-05 09:24:24.909 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:24 compute-1 nova_compute[189066]: 2025-12-05 09:24:24.913 189070 DEBUG nova.scheduler.client.report [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:24:25 compute-1 nova_compute[189066]: 2025-12-05 09:24:25.072 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:25 compute-1 nova_compute[189066]: 2025-12-05 09:24:25.073 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:24:25 compute-1 nova_compute[189066]: 2025-12-05 09:24:25.491 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:24:25 compute-1 nova_compute[189066]: 2025-12-05 09:24:25.491 189070 DEBUG nova.network.neutron [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:24:25 compute-1 nova_compute[189066]: 2025-12-05 09:24:25.643 189070 INFO nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:24:25 compute-1 kernel: tapb2d6abb2-b3 (unregistering): left promiscuous mode
Dec 05 09:24:25 compute-1 NetworkManager[55704]: <info>  [1764926665.7862] device (tapb2d6abb2-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:24:25 compute-1 ovn_controller[95809]: 2025-12-05T09:24:25Z|00048|binding|INFO|Releasing lport b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 from this chassis (sb_readonly=0)
Dec 05 09:24:25 compute-1 nova_compute[189066]: 2025-12-05 09:24:25.830 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:25 compute-1 ovn_controller[95809]: 2025-12-05T09:24:25Z|00049|binding|INFO|Setting lport b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 down in Southbound
Dec 05 09:24:25 compute-1 ovn_controller[95809]: 2025-12-05T09:24:25Z|00050|binding|INFO|Removing iface tapb2d6abb2-b3 ovn-installed in OVS
Dec 05 09:24:25 compute-1 nova_compute[189066]: 2025-12-05 09:24:25.834 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:25 compute-1 nova_compute[189066]: 2025-12-05 09:24:25.845 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:25 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec 05 09:24:25 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Consumed 18.817s CPU time.
Dec 05 09:24:25 compute-1 systemd-machined[154815]: Machine qemu-2-instance-00000003 terminated.
Dec 05 09:24:25 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:25.989 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:1d:2e 10.100.0.8'], port_security=['fa:16:3e:57:1d:2e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-359084ff-88c9-4324-bfef-b1ef9f403118', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc71c22e-381a-49db-a2d4-d596204126ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e32f91e2-293f-4583-87a5-1928a1341468, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:24:25 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:25.991 105272 INFO neutron.agent.ovn.metadata.agent [-] Port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 in datapath 359084ff-88c9-4324-bfef-b1ef9f403118 unbound from our chassis
Dec 05 09:24:25 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:25.995 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 359084ff-88c9-4324-bfef-b1ef9f403118, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:25.999 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[abb2fd09-63c5-4379-b21a-f4b8614df6d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.001 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118 namespace which is not needed anymore
Dec 05 09:24:26 compute-1 kernel: tapb2d6abb2-b3: entered promiscuous mode
Dec 05 09:24:26 compute-1 kernel: tapb2d6abb2-b3 (unregistering): left promiscuous mode
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.027 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.094 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:24:26 compute-1 neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118[221153]: [NOTICE]   (221157) : haproxy version is 2.8.14-c23fe91
Dec 05 09:24:26 compute-1 neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118[221153]: [NOTICE]   (221157) : path to executable is /usr/sbin/haproxy
Dec 05 09:24:26 compute-1 neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118[221153]: [WARNING]  (221157) : Exiting Master process...
Dec 05 09:24:26 compute-1 neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118[221153]: [WARNING]  (221157) : Exiting Master process...
Dec 05 09:24:26 compute-1 neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118[221153]: [ALERT]    (221157) : Current worker (221159) exited with code 143 (Terminated)
Dec 05 09:24:26 compute-1 neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118[221153]: [WARNING]  (221157) : All workers exited. Exiting... (0)
Dec 05 09:24:26 compute-1 systemd[1]: libpod-e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7.scope: Deactivated successfully.
Dec 05 09:24:26 compute-1 podman[221897]: 2025-12-05 09:24:26.179415374 +0000 UTC m=+0.049489865 container died e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 09:24:26 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7-userdata-shm.mount: Deactivated successfully.
Dec 05 09:24:26 compute-1 systemd[1]: var-lib-containers-storage-overlay-d4a53f9d0734f0aa74de1f2635f61e4b9f8b0e08303c49cd54990aeff28aece0-merged.mount: Deactivated successfully.
Dec 05 09:24:26 compute-1 podman[221897]: 2025-12-05 09:24:26.22907796 +0000 UTC m=+0.099152441 container cleanup e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 09:24:26 compute-1 systemd[1]: libpod-conmon-e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7.scope: Deactivated successfully.
Dec 05 09:24:26 compute-1 podman[221928]: 2025-12-05 09:24:26.308678341 +0000 UTC m=+0.049338420 container remove e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.315 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[326f0c59-3414-42a8-8efc-7f8126f5cab6]: (4, ('Fri Dec  5 09:24:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118 (e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7)\ne10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7\nFri Dec  5 09:24:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118 (e10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7)\ne10179d8c493adc669af42844f7fdc869ff8365d46ce07f1f025f9c233eeb8b7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.317 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6cb05a-ddbd-4c67-8228-e0d5949ea145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.318 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap359084ff-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.320 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:26 compute-1 kernel: tap359084ff-80: left promiscuous mode
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.336 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.342 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[923df944-f60f-4b61-8640-26ac246ef8cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.361 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2a09676a-d1b5-4135-a771-a93ed3164755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.362 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f3bb7a22-628e-4ff1-84a8-baebd6058a48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.382 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[48ad4dd7-5d6d-436e-91c7-64124496e815]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384672, 'reachable_time': 20536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221947, 'error': None, 'target': 'ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:24:26 compute-1 systemd[1]: run-netns-ovnmeta\x2d359084ff\x2d88c9\x2d4324\x2dbfef\x2db1ef9f403118.mount: Deactivated successfully.
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.387 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-359084ff-88c9-4324-bfef-b1ef9f403118 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:24:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:24:26.387 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[13aea0ee-0250-4bb0-92b5-435c3e6d9e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.562 189070 INFO nova.virt.libvirt.driver [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Instance shutdown successfully after 3 seconds.
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.567 189070 INFO nova.virt.libvirt.driver [-] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Instance destroyed successfully.
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.568 189070 DEBUG nova.virt.libvirt.vif [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:21:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-804921789',display_name='tempest-TestNetworkAdvancedServerOps-server-804921789',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-804921789',id=3,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDW3qadE13zsRSDKJlvOjZ/sgAUF4xLDU490CG+fZt4wcZlik42yE/XMAhTvQFBPRgAXCQSd0Lb+5EKERW2D+JVe4gr8wR12Ds02HfiWcpROJrTejCLEVPRm4reH/9sdrA==',key_name='tempest-TestNetworkAdvancedServerOps-175847342',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:23:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-rmofgs1d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:24:07Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=2b94d7d0-b000-460f-8883-8953d60115d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--271958037", "vif_mac": "fa:16:3e:57:1d:2e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.568 189070 DEBUG nova.network.os_vif_util [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Converting VIF {"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--271958037", "vif_mac": "fa:16:3e:57:1d:2e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.569 189070 DEBUG nova.network.os_vif_util [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.570 189070 DEBUG os_vif [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.573 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.573 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2d6abb2-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.575 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.577 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.586 189070 INFO os_vif [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3')
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.591 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.650 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.651 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.713 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.715 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Copying file /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk to 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 05 09:24:26 compute-1 nova_compute[189066]: 2025-12-05 09:24:26.715 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.149 189070 DEBUG nova.policy [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.355 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "scp -r /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk" returned: 0 in 0.640s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.356 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Copying file /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.357 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk.config 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.580 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.583 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.583 189070 INFO nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Creating image(s)
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.584 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "/var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.585 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "/var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.585 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "/var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.602 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "scp -C -r /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk.config 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.config" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.604 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Copying file /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.604 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk.info 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.628 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:27 compute-1 podman[221958]: 2025-12-05 09:24:27.650996011 +0000 UTC m=+0.074720713 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.694 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.695 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.695 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.706 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.768 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.769 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.807 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.808 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.809 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.843 189070 DEBUG oslo_concurrency.processutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "scp -C -r /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0_resize/disk.info 192.168.122.100:/var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk.info" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.866 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.867 189070 DEBUG nova.virt.disk.api [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Checking if we can resize image /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.867 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.929 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.930 189070 DEBUG nova.virt.disk.api [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Cannot resize image /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:24:27 compute-1 nova_compute[189066]: 2025-12-05 09:24:27.930 189070 DEBUG nova.objects.instance [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'migration_context' on Instance uuid 6b54aedd-9d9e-436b-9010-c5b04ffaca40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:24:28 compute-1 nova_compute[189066]: 2025-12-05 09:24:28.642 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.089 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.090 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Ensure instance console log exists: /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.091 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.091 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.091 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.112 189070 DEBUG neutronclient.v2_0.client [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.148 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.149 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.149 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "95f85266-e2bc-4615-b523-e7346ad3ab40-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.281 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.282 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.282 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.282 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.508 189070 WARNING nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Periodic task is updating the host stats, it is trying to get disk info for instance-00000003, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/2b94d7d0-b000-460f-8883-8953d60115d0/disk
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.567 189070 DEBUG oslo_concurrency.lockutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.567 189070 DEBUG oslo_concurrency.lockutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.567 189070 DEBUG oslo_concurrency.lockutils [None req-3b78da78-4a16-441d-a1c2-3ae5cd1497db 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.675 189070 WARNING nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.677 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5757MB free_disk=73.30504608154297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.677 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:29 compute-1 nova_compute[189066]: 2025-12-05 09:24:29.677 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.219 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Migration for instance 2b94d7d0-b000-460f-8883-8953d60115d0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.220 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Migration for instance 95f85266-e2bc-4615-b523-e7346ad3ab40 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Dec 05 09:24:30 compute-1 podman[221994]: 2025-12-05 09:24:30.6678114 +0000 UTC m=+0.096908055 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.684 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.685 189070 INFO nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating resource usage from migration 28d48d37-57dd-4441-8e4e-cf3cfac9be6c
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.685 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Starting to track outgoing migration 28d48d37-57dd-4441-8e4e-cf3cfac9be6c with flavor fbadeab4-f24f-4100-963a-d228b2a6f7c4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.724 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Migration 60ad7bb0-a13d-4ca7-a419-a16d5433e46c is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.725 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Migration 28d48d37-57dd-4441-8e4e-cf3cfac9be6c is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.725 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Instance 6b54aedd-9d9e-436b-9010-c5b04ffaca40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.725 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.725 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.845 189070 DEBUG nova.compute.provider_tree [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:24:30 compute-1 nova_compute[189066]: 2025-12-05 09:24:30.988 189070 DEBUG nova.scheduler.client.report [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:24:31 compute-1 nova_compute[189066]: 2025-12-05 09:24:31.021 189070 DEBUG nova.compute.resource_tracker [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:24:31 compute-1 nova_compute[189066]: 2025-12-05 09:24:31.021 189070 DEBUG oslo_concurrency.lockutils [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:31 compute-1 nova_compute[189066]: 2025-12-05 09:24:31.029 189070 INFO nova.compute.manager [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Dec 05 09:24:31 compute-1 nova_compute[189066]: 2025-12-05 09:24:31.608 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:31 compute-1 nova_compute[189066]: 2025-12-05 09:24:31.834 189070 INFO nova.scheduler.client.report [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Deleted allocation for migration 60ad7bb0-a13d-4ca7-a419-a16d5433e46c
Dec 05 09:24:31 compute-1 nova_compute[189066]: 2025-12-05 09:24:31.835 189070 DEBUG nova.virt.libvirt.driver [None req-121aab57-9143-4c2f-8514-a7ce33bf2580 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 95f85266-e2bc-4615-b523-e7346ad3ab40] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Dec 05 09:24:32 compute-1 nova_compute[189066]: 2025-12-05 09:24:32.302 189070 DEBUG nova.compute.manager [req-e467025a-0c5b-44df-b421-d133fce1bffc req-43b34ab9-65a8-48b5-9b46-65ca2cbe7ba0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-vif-unplugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:24:32 compute-1 nova_compute[189066]: 2025-12-05 09:24:32.303 189070 DEBUG oslo_concurrency.lockutils [req-e467025a-0c5b-44df-b421-d133fce1bffc req-43b34ab9-65a8-48b5-9b46-65ca2cbe7ba0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:32 compute-1 nova_compute[189066]: 2025-12-05 09:24:32.303 189070 DEBUG oslo_concurrency.lockutils [req-e467025a-0c5b-44df-b421-d133fce1bffc req-43b34ab9-65a8-48b5-9b46-65ca2cbe7ba0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:32 compute-1 nova_compute[189066]: 2025-12-05 09:24:32.304 189070 DEBUG oslo_concurrency.lockutils [req-e467025a-0c5b-44df-b421-d133fce1bffc req-43b34ab9-65a8-48b5-9b46-65ca2cbe7ba0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:32 compute-1 nova_compute[189066]: 2025-12-05 09:24:32.304 189070 DEBUG nova.compute.manager [req-e467025a-0c5b-44df-b421-d133fce1bffc req-43b34ab9-65a8-48b5-9b46-65ca2cbe7ba0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] No waiting events found dispatching network-vif-unplugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:24:32 compute-1 nova_compute[189066]: 2025-12-05 09:24:32.304 189070 WARNING nova.compute.manager [req-e467025a-0c5b-44df-b421-d133fce1bffc req-43b34ab9-65a8-48b5-9b46-65ca2cbe7ba0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received unexpected event network-vif-unplugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 for instance with vm_state active and task_state resize_migrated.
Dec 05 09:24:33 compute-1 podman[222020]: 2025-12-05 09:24:33.619799612 +0000 UTC m=+0.058336621 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:24:33 compute-1 nova_compute[189066]: 2025-12-05 09:24:33.644 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:36 compute-1 nova_compute[189066]: 2025-12-05 09:24:36.610 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:36 compute-1 podman[222039]: 2025-12-05 09:24:36.630098142 +0000 UTC m=+0.058723980 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 05 09:24:38 compute-1 nova_compute[189066]: 2025-12-05 09:24:38.683 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:40 compute-1 sshd-session[222060]: Received disconnect from 185.118.15.236 port 36042:11: Bye Bye [preauth]
Dec 05 09:24:40 compute-1 sshd-session[222060]: Disconnected from authenticating user root 185.118.15.236 port 36042 [preauth]
Dec 05 09:24:40 compute-1 podman[222062]: 2025-12-05 09:24:40.634742526 +0000 UTC m=+0.074241250 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 05 09:24:41 compute-1 nova_compute[189066]: 2025-12-05 09:24:41.061 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764926666.0597053, 2b94d7d0-b000-460f-8883-8953d60115d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:24:41 compute-1 nova_compute[189066]: 2025-12-05 09:24:41.062 189070 INFO nova.compute.manager [-] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] VM Stopped (Lifecycle Event)
Dec 05 09:24:41 compute-1 nova_compute[189066]: 2025-12-05 09:24:41.612 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:41 compute-1 nova_compute[189066]: 2025-12-05 09:24:41.757 189070 DEBUG nova.compute.manager [None req-2a832e58-c251-46a5-b915-1eff2bcc0740 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:24:41 compute-1 nova_compute[189066]: 2025-12-05 09:24:41.761 189070 DEBUG nova.compute.manager [None req-2a832e58-c251-46a5-b915-1eff2bcc0740 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_migrated, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:24:42 compute-1 nova_compute[189066]: 2025-12-05 09:24:42.920 189070 INFO nova.compute.manager [None req-2a832e58-c251-46a5-b915-1eff2bcc0740 - - - - - -] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Dec 05 09:24:43 compute-1 podman[222083]: 2025-12-05 09:24:43.60237457 +0000 UTC m=+0.048839297 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:24:43 compute-1 nova_compute[189066]: 2025-12-05 09:24:43.685 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:44 compute-1 nova_compute[189066]: 2025-12-05 09:24:44.343 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:44 compute-1 nova_compute[189066]: 2025-12-05 09:24:44.491 189070 DEBUG nova.compute.manager [req-6b518ac1-655f-4068-a2fe-da234a1a0175 req-697c6efa-b4f1-4883-93fa-a0bfae56d9da 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:24:44 compute-1 nova_compute[189066]: 2025-12-05 09:24:44.492 189070 DEBUG oslo_concurrency.lockutils [req-6b518ac1-655f-4068-a2fe-da234a1a0175 req-697c6efa-b4f1-4883-93fa-a0bfae56d9da 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:24:44 compute-1 nova_compute[189066]: 2025-12-05 09:24:44.493 189070 DEBUG oslo_concurrency.lockutils [req-6b518ac1-655f-4068-a2fe-da234a1a0175 req-697c6efa-b4f1-4883-93fa-a0bfae56d9da 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:24:44 compute-1 nova_compute[189066]: 2025-12-05 09:24:44.493 189070 DEBUG oslo_concurrency.lockutils [req-6b518ac1-655f-4068-a2fe-da234a1a0175 req-697c6efa-b4f1-4883-93fa-a0bfae56d9da 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:24:44 compute-1 nova_compute[189066]: 2025-12-05 09:24:44.493 189070 DEBUG nova.compute.manager [req-6b518ac1-655f-4068-a2fe-da234a1a0175 req-697c6efa-b4f1-4883-93fa-a0bfae56d9da 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] No waiting events found dispatching network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:24:44 compute-1 nova_compute[189066]: 2025-12-05 09:24:44.493 189070 WARNING nova.compute.manager [req-6b518ac1-655f-4068-a2fe-da234a1a0175 req-697c6efa-b4f1-4883-93fa-a0bfae56d9da 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received unexpected event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 for instance with vm_state active and task_state resize_migrated.
Dec 05 09:24:44 compute-1 nova_compute[189066]: 2025-12-05 09:24:44.687 189070 DEBUG nova.network.neutron [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Successfully created port: 64e441f0-08b6-483c-9732-5217cdbe1468 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:24:46 compute-1 nova_compute[189066]: 2025-12-05 09:24:46.657 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:48 compute-1 nova_compute[189066]: 2025-12-05 09:24:48.687 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:50 compute-1 podman[222105]: 2025-12-05 09:24:50.617132081 +0000 UTC m=+0.054735652 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:24:51 compute-1 nova_compute[189066]: 2025-12-05 09:24:51.659 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:52 compute-1 nova_compute[189066]: 2025-12-05 09:24:52.006 189070 DEBUG nova.compute.manager [req-48cf7118-6cf8-4704-9c42-5bfeb38f846e req-b5e57c1b-cca7-444c-aaac-89305f7840e5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-changed-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:24:52 compute-1 nova_compute[189066]: 2025-12-05 09:24:52.006 189070 DEBUG nova.compute.manager [req-48cf7118-6cf8-4704-9c42-5bfeb38f846e req-b5e57c1b-cca7-444c-aaac-89305f7840e5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Refreshing instance network info cache due to event network-changed-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:24:52 compute-1 nova_compute[189066]: 2025-12-05 09:24:52.007 189070 DEBUG oslo_concurrency.lockutils [req-48cf7118-6cf8-4704-9c42-5bfeb38f846e req-b5e57c1b-cca7-444c-aaac-89305f7840e5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:24:52 compute-1 nova_compute[189066]: 2025-12-05 09:24:52.007 189070 DEBUG oslo_concurrency.lockutils [req-48cf7118-6cf8-4704-9c42-5bfeb38f846e req-b5e57c1b-cca7-444c-aaac-89305f7840e5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:24:52 compute-1 nova_compute[189066]: 2025-12-05 09:24:52.007 189070 DEBUG nova.network.neutron [req-48cf7118-6cf8-4704-9c42-5bfeb38f846e req-b5e57c1b-cca7-444c-aaac-89305f7840e5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Refreshing network info cache for port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:24:53 compute-1 nova_compute[189066]: 2025-12-05 09:24:53.630 189070 DEBUG nova.network.neutron [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Successfully updated port: 64e441f0-08b6-483c-9732-5217cdbe1468 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:24:53 compute-1 nova_compute[189066]: 2025-12-05 09:24:53.664 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:24:53 compute-1 nova_compute[189066]: 2025-12-05 09:24:53.665 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquired lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:24:53 compute-1 nova_compute[189066]: 2025-12-05 09:24:53.665 189070 DEBUG nova.network.neutron [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:24:53 compute-1 nova_compute[189066]: 2025-12-05 09:24:53.689 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:54 compute-1 nova_compute[189066]: 2025-12-05 09:24:54.622 189070 DEBUG nova.network.neutron [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:24:55 compute-1 nova_compute[189066]: 2025-12-05 09:24:55.731 189070 DEBUG nova.network.neutron [req-48cf7118-6cf8-4704-9c42-5bfeb38f846e req-b5e57c1b-cca7-444c-aaac-89305f7840e5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updated VIF entry in instance network info cache for port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:24:55 compute-1 nova_compute[189066]: 2025-12-05 09:24:55.731 189070 DEBUG nova.network.neutron [req-48cf7118-6cf8-4704-9c42-5bfeb38f846e req-b5e57c1b-cca7-444c-aaac-89305f7840e5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating instance_info_cache with network_info: [{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:24:56 compute-1 nova_compute[189066]: 2025-12-05 09:24:56.378 189070 DEBUG oslo_concurrency.lockutils [req-48cf7118-6cf8-4704-9c42-5bfeb38f846e req-b5e57c1b-cca7-444c-aaac-89305f7840e5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:24:56 compute-1 nova_compute[189066]: 2025-12-05 09:24:56.661 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:57 compute-1 nova_compute[189066]: 2025-12-05 09:24:57.645 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:58 compute-1 podman[222129]: 2025-12-05 09:24:58.62065874 +0000 UTC m=+0.060228416 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 09:24:58 compute-1 nova_compute[189066]: 2025-12-05 09:24:58.748 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:24:58 compute-1 nova_compute[189066]: 2025-12-05 09:24:58.841 189070 DEBUG nova.compute.manager [req-8c470638-495e-4404-a286-2e033c520348 req-c2e95988-9979-4a4c-a962-6fc30e240cbe 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-changed-64e441f0-08b6-483c-9732-5217cdbe1468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:24:58 compute-1 nova_compute[189066]: 2025-12-05 09:24:58.841 189070 DEBUG nova.compute.manager [req-8c470638-495e-4404-a286-2e033c520348 req-c2e95988-9979-4a4c-a962-6fc30e240cbe 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Refreshing instance network info cache due to event network-changed-64e441f0-08b6-483c-9732-5217cdbe1468. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:24:58 compute-1 nova_compute[189066]: 2025-12-05 09:24:58.841 189070 DEBUG oslo_concurrency.lockutils [req-8c470638-495e-4404-a286-2e033c520348 req-c2e95988-9979-4a4c-a962-6fc30e240cbe 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:25:00 compute-1 nova_compute[189066]: 2025-12-05 09:25:00.861 189070 DEBUG nova.network.neutron [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updating instance_info_cache with network_info: [{"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.663 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:01 compute-1 podman[222149]: 2025-12-05 09:25:01.684139954 +0000 UTC m=+0.119107821 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.801 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Releasing lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.802 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Instance network_info: |[{"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.802 189070 DEBUG oslo_concurrency.lockutils [req-8c470638-495e-4404-a286-2e033c520348 req-c2e95988-9979-4a4c-a962-6fc30e240cbe 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.803 189070 DEBUG nova.network.neutron [req-8c470638-495e-4404-a286-2e033c520348 req-c2e95988-9979-4a4c-a962-6fc30e240cbe 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Refreshing network info cache for port 64e441f0-08b6-483c-9732-5217cdbe1468 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.806 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Start _get_guest_xml network_info=[{"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.818 189070 WARNING nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.825 189070 DEBUG nova.virt.libvirt.host [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.825 189070 DEBUG nova.virt.libvirt.host [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.830 189070 DEBUG nova.virt.libvirt.host [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.831 189070 DEBUG nova.virt.libvirt.host [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.833 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.833 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.833 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.833 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.834 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.834 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.834 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.834 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.834 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.835 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.835 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.835 189070 DEBUG nova.virt.hardware [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.840 189070 DEBUG nova.virt.libvirt.vif [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:24:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-520554038',display_name='tempest-TestGettingAddress-server-520554038',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-520554038',id=7,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCHemSCbldDOeAvAklAY1GBE7IpwWfP2ZPUnyVv6eMnqJsn+zpySVsIxCVwt5x3P5v8sd6NBMOWL+BJqkJAyHVGLR2hUEXOHFYLsF9I/mokK0nJIvCbRk6qzwE1U+KhNRw==',key_name='tempest-TestGettingAddress-981102965',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-9vtmxokr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:24:27Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=6b54aedd-9d9e-436b-9010-c5b04ffaca40,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.840 189070 DEBUG nova.network.os_vif_util [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.841 189070 DEBUG nova.network.os_vif_util [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:73:50,bridge_name='br-int',has_traffic_filtering=True,id=64e441f0-08b6-483c-9732-5217cdbe1468,network=Network(f58cc02f-396f-494d-8f1e-d6f4412689c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64e441f0-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.842 189070 DEBUG nova.objects.instance [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6b54aedd-9d9e-436b-9010-c5b04ffaca40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.922 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <uuid>6b54aedd-9d9e-436b-9010-c5b04ffaca40</uuid>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <name>instance-00000007</name>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <nova:name>tempest-TestGettingAddress-server-520554038</nova:name>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:25:01</nova:creationTime>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:25:01 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:25:01 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:25:01 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:25:01 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:25:01 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:25:01 compute-1 nova_compute[189066]:         <nova:user uuid="fae1c60e378945ea84b34c4824b835b1">tempest-TestGettingAddress-8368731-project-member</nova:user>
Dec 05 09:25:01 compute-1 nova_compute[189066]:         <nova:project uuid="fa1cd463d74b49139a088d332d37e611">tempest-TestGettingAddress-8368731</nova:project>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:25:01 compute-1 nova_compute[189066]:         <nova:port uuid="64e441f0-08b6-483c-9732-5217cdbe1468">
Dec 05 09:25:01 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe5f:7350" ipVersion="6"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <system>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <entry name="serial">6b54aedd-9d9e-436b-9010-c5b04ffaca40</entry>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <entry name="uuid">6b54aedd-9d9e-436b-9010-c5b04ffaca40</entry>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </system>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <os>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   </os>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <features>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   </features>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.config"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:5f:73:50"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <target dev="tap64e441f0-08"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/console.log" append="off"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <video>
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </video>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:25:01 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:25:01 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:25:01 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:25:01 compute-1 nova_compute[189066]: </domain>
Dec 05 09:25:01 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.923 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Preparing to wait for external event network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.924 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.924 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.925 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.925 189070 DEBUG nova.virt.libvirt.vif [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:24:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-520554038',display_name='tempest-TestGettingAddress-server-520554038',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-520554038',id=7,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCHemSCbldDOeAvAklAY1GBE7IpwWfP2ZPUnyVv6eMnqJsn+zpySVsIxCVwt5x3P5v8sd6NBMOWL+BJqkJAyHVGLR2hUEXOHFYLsF9I/mokK0nJIvCbRk6qzwE1U+KhNRw==',key_name='tempest-TestGettingAddress-981102965',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-9vtmxokr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:24:27Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=6b54aedd-9d9e-436b-9010-c5b04ffaca40,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.926 189070 DEBUG nova.network.os_vif_util [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.926 189070 DEBUG nova.network.os_vif_util [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:73:50,bridge_name='br-int',has_traffic_filtering=True,id=64e441f0-08b6-483c-9732-5217cdbe1468,network=Network(f58cc02f-396f-494d-8f1e-d6f4412689c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64e441f0-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.927 189070 DEBUG os_vif [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:73:50,bridge_name='br-int',has_traffic_filtering=True,id=64e441f0-08b6-483c-9732-5217cdbe1468,network=Network(f58cc02f-396f-494d-8f1e-d6f4412689c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64e441f0-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.927 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.928 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.929 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.932 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.932 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap64e441f0-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.933 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap64e441f0-08, col_values=(('external_ids', {'iface-id': '64e441f0-08b6-483c-9732-5217cdbe1468', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:73:50', 'vm-uuid': '6b54aedd-9d9e-436b-9010-c5b04ffaca40'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.934 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.936 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:25:01 compute-1 NetworkManager[55704]: <info>  [1764926701.9367] manager: (tap64e441f0-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.942 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:01 compute-1 nova_compute[189066]: 2025-12-05 09:25:01.943 189070 INFO os_vif [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:73:50,bridge_name='br-int',has_traffic_filtering=True,id=64e441f0-08b6-483c-9732-5217cdbe1468,network=Network(f58cc02f-396f-494d-8f1e-d6f4412689c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64e441f0-08')
Dec 05 09:25:03 compute-1 nova_compute[189066]: 2025-12-05 09:25:03.225 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:25:03 compute-1 nova_compute[189066]: 2025-12-05 09:25:03.226 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:25:03 compute-1 nova_compute[189066]: 2025-12-05 09:25:03.226 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No VIF found with MAC fa:16:3e:5f:73:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:25:03 compute-1 nova_compute[189066]: 2025-12-05 09:25:03.227 189070 INFO nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Using config drive
Dec 05 09:25:03 compute-1 nova_compute[189066]: 2025-12-05 09:25:03.751 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:04 compute-1 podman[222177]: 2025-12-05 09:25:04.620755197 +0000 UTC m=+0.058574026 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.930 189070 DEBUG nova.compute.manager [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.930 189070 DEBUG oslo_concurrency.lockutils [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.930 189070 DEBUG oslo_concurrency.lockutils [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.930 189070 DEBUG oslo_concurrency.lockutils [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.931 189070 DEBUG nova.compute.manager [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] No waiting events found dispatching network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.931 189070 WARNING nova.compute.manager [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received unexpected event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 for instance with vm_state resized and task_state None.
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.931 189070 DEBUG nova.compute.manager [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.931 189070 DEBUG oslo_concurrency.lockutils [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.931 189070 DEBUG oslo_concurrency.lockutils [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.932 189070 DEBUG oslo_concurrency.lockutils [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.932 189070 DEBUG nova.compute.manager [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] No waiting events found dispatching network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:25:04 compute-1 nova_compute[189066]: 2025-12-05 09:25:04.932 189070 WARNING nova.compute.manager [req-2f2ed33e-6520-4f3d-aa0c-44b63e62bd65 req-be35fdc9-0e87-4247-aec7-b0e91cd2cd23 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Received unexpected event network-vif-plugged-b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 for instance with vm_state resized and task_state None.
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.029 189070 INFO nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Creating config drive at /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.config
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.035 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplu1e3qjt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.168 189070 DEBUG oslo_concurrency.processutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplu1e3qjt" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:05 compute-1 kernel: tap64e441f0-08: entered promiscuous mode
Dec 05 09:25:05 compute-1 NetworkManager[55704]: <info>  [1764926705.2350] manager: (tap64e441f0-08): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Dec 05 09:25:05 compute-1 ovn_controller[95809]: 2025-12-05T09:25:05Z|00051|binding|INFO|Claiming lport 64e441f0-08b6-483c-9732-5217cdbe1468 for this chassis.
Dec 05 09:25:05 compute-1 ovn_controller[95809]: 2025-12-05T09:25:05Z|00052|binding|INFO|64e441f0-08b6-483c-9732-5217cdbe1468: Claiming fa:16:3e:5f:73:50 10.100.0.3 2001:db8::f816:3eff:fe5f:7350
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.235 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:05 compute-1 ovn_controller[95809]: 2025-12-05T09:25:05Z|00053|binding|INFO|Setting lport 64e441f0-08b6-483c-9732-5217cdbe1468 ovn-installed in OVS
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.249 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:05 compute-1 ovn_controller[95809]: 2025-12-05T09:25:05Z|00054|binding|INFO|Setting lport 64e441f0-08b6-483c-9732-5217cdbe1468 up in Southbound
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.251 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:73:50 10.100.0.3 2001:db8::f816:3eff:fe5f:7350'], port_security=['fa:16:3e:5f:73:50 10.100.0.3 2001:db8::f816:3eff:fe5f:7350'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe5f:7350/64', 'neutron:device_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '94445a7f-7152-4017-ac4c-5834cf45389c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a03fccf0-8b31-495a-b68a-70be5d3c0194, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=64e441f0-08b6-483c-9732-5217cdbe1468) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.252 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 64e441f0-08b6-483c-9732-5217cdbe1468 in datapath f58cc02f-396f-494d-8f1e-d6f4412689c2 bound to our chassis
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.255 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.255 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f58cc02f-396f-494d-8f1e-d6f4412689c2
Dec 05 09:25:05 compute-1 systemd-udevd[222214]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.269 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[96f96d59-9e02-4693-a36a-89f328155e22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.270 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf58cc02f-31 in ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.272 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf58cc02f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.272 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[30b64edc-67b2-4454-8e2d-aa8d7d8317c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.273 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e16a48-fe2d-4bc5-a56a-42d9ff710868]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 systemd-machined[154815]: New machine qemu-3-instance-00000007.
Dec 05 09:25:05 compute-1 NetworkManager[55704]: <info>  [1764926705.2801] device (tap64e441f0-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:25:05 compute-1 NetworkManager[55704]: <info>  [1764926705.2815] device (tap64e441f0-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.286 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[70d49353-dd81-4f24-bc78-3be8972216d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.312 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f0139685-e3b7-4397-a58a-f5f1da46ee9f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.356 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[434195a2-fd8b-4434-be1a-e4e273a4e61f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 systemd-udevd[222218]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:25:05 compute-1 NetworkManager[55704]: <info>  [1764926705.3647] manager: (tapf58cc02f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.364 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[87d078bd-37fe-444a-8230-72f039d272e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.387 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "2b94d7d0-b000-460f-8883-8953d60115d0" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.388 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.388 189070 DEBUG nova.compute.manager [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Going to confirm migration 2 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.406 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[4a5b7f3d-1a06-403b-a2f1-9b8de7a3e5ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.410 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[9362b514-f97c-4e02-a07f-5d12e19dade4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 NetworkManager[55704]: <info>  [1764926705.4337] device (tapf58cc02f-30): carrier: link connected
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.437 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[feafdf9a-20b2-46b3-bc5a-6e79ed78fa21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.456 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4581ee93-c914-48c5-9d78-a68d0588fa62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf58cc02f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:dc:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397843, 'reachable_time': 15599, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222247, 'error': None, 'target': 'ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.475 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e42fd0df-60ba-463e-9382-20d94c63d7ce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:dcb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397843, 'tstamp': 397843}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222248, 'error': None, 'target': 'ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.505 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8a093ff1-d24a-4d7a-ad7d-3367340b38d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf58cc02f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:dc:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397843, 'reachable_time': 15599, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222249, 'error': None, 'target': 'ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.548 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[794137e2-0f87-4f02-a248-090a74e7f6cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.630 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f965429e-0d7b-4aba-90c2-46e980ce4b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.632 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf58cc02f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.632 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.633 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf58cc02f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.635 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:05 compute-1 kernel: tapf58cc02f-30: entered promiscuous mode
Dec 05 09:25:05 compute-1 NetworkManager[55704]: <info>  [1764926705.6377] manager: (tapf58cc02f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.639 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.641 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf58cc02f-30, col_values=(('external_ids', {'iface-id': 'fdbc0f28-ff71-4c6c-87fd-d55723f69ac2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.643 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:05 compute-1 ovn_controller[95809]: 2025-12-05T09:25:05Z|00055|binding|INFO|Releasing lport fdbc0f28-ff71-4c6c-87fd-d55723f69ac2 from this chassis (sb_readonly=0)
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.645 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.646 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f58cc02f-396f-494d-8f1e-d6f4412689c2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f58cc02f-396f-494d-8f1e-d6f4412689c2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.647 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7239f7d9-7d68-4c37-8faa-ddff2ff6da12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.648 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-f58cc02f-396f-494d-8f1e-d6f4412689c2
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/f58cc02f-396f-494d-8f1e-d6f4412689c2.pid.haproxy
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID f58cc02f-396f-494d-8f1e-d6f4412689c2
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.648 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'env', 'PROCESS_TAG=haproxy-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f58cc02f-396f-494d-8f1e-d6f4412689c2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.667 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:05.984 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:25:05 compute-1 nova_compute[189066]: 2025-12-05 09:25:05.984 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:06 compute-1 podman[222278]: 2025-12-05 09:25:06.072243083 +0000 UTC m=+0.054132548 container create 3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:25:06 compute-1 systemd[1]: Started libpod-conmon-3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87.scope.
Dec 05 09:25:06 compute-1 podman[222278]: 2025-12-05 09:25:06.042486203 +0000 UTC m=+0.024375698 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:25:06 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:25:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/318d7a1752c837b5ccc0860fe58bdf6906d58916a38f4de6c081886ae9c90363/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:25:06 compute-1 podman[222278]: 2025-12-05 09:25:06.178731062 +0000 UTC m=+0.160620547 container init 3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 09:25:06 compute-1 podman[222278]: 2025-12-05 09:25:06.184765409 +0000 UTC m=+0.166654874 container start 3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:25:06 compute-1 neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2[222294]: [NOTICE]   (222298) : New worker (222300) forked
Dec 05 09:25:06 compute-1 neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2[222294]: [NOTICE]   (222298) : Loading success.
Dec 05 09:25:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:06.243 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.427 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926706.426096, 6b54aedd-9d9e-436b-9010-c5b04ffaca40 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.427 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] VM Started (Lifecycle Event)
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.473 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.478 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926706.4268825, 6b54aedd-9d9e-436b-9010-c5b04ffaca40 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.479 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] VM Paused (Lifecycle Event)
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.530 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.534 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.572 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.935 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.997 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "b4e74925-5201-4f70-9beb-258ed9dc025a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:06 compute-1 nova_compute[189066]: 2025-12-05 09:25:06.998 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.126 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.294 189070 DEBUG nova.network.neutron [req-8c470638-495e-4404-a286-2e033c520348 req-c2e95988-9979-4a4c-a962-6fc30e240cbe 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updated VIF entry in instance network info cache for port 64e441f0-08b6-483c-9732-5217cdbe1468. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.294 189070 DEBUG nova.network.neutron [req-8c470638-495e-4404-a286-2e033c520348 req-c2e95988-9979-4a4c-a962-6fc30e240cbe 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updating instance_info_cache with network_info: [{"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.332 189070 DEBUG oslo_concurrency.lockutils [req-8c470638-495e-4404-a286-2e033c520348 req-c2e95988-9979-4a4c-a962-6fc30e240cbe 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.375 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.376 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.377 189070 DEBUG neutronclient.v2_0.client [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.378 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.378 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquired lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.378 189070 DEBUG nova.network.neutron [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.378 189070 DEBUG nova.objects.instance [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'info_cache' on Instance uuid 2b94d7d0-b000-460f-8883-8953d60115d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.386 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.387 189070 INFO nova.compute.claims [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:25:07 compute-1 podman[222316]: 2025-12-05 09:25:07.626674571 +0000 UTC m=+0.064264856 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.792 189070 DEBUG nova.compute.provider_tree [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.825 189070 DEBUG nova.scheduler.client.report [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.860 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:07 compute-1 nova_compute[189066]: 2025-12-05 09:25:07.861 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.008 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.008 189070 DEBUG nova.network.neutron [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.108 189070 INFO nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.150 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.408 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.409 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.410 189070 INFO nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Creating image(s)
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.410 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "/var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.411 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "/var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.411 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "/var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.425 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.486 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.487 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.488 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.498 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.524 189070 DEBUG nova.policy [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3cf2b75d6732438a9a3626bc5db6d76e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efceb9f71c3c447ea59c2d2694f9e636', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.566 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.568 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.612 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.614 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.614 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.688 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.690 189070 DEBUG nova.virt.disk.api [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Checking if we can resize image /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.690 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.751 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.753 189070 DEBUG nova.virt.disk.api [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Cannot resize image /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.753 189070 DEBUG nova.objects.instance [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lazy-loading 'migration_context' on Instance uuid b4e74925-5201-4f70-9beb-258ed9dc025a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.784 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.811 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.812 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Ensure instance console log exists: /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.812 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.813 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:08 compute-1 nova_compute[189066]: 2025-12-05 09:25:08.813 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:08.870 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:08.870 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:08.871 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:09.247 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.747 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2b94d7d0-b000-460f-8883-8953d60115d0', 'name': 'tempest-TestNetworkAdvancedServerOps-server-804921789', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'e26ae3fdd48d4947978a480f70e14f84', 'user_id': '65751a90715341b2984ef84ebbaa1650', 'hostId': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.751 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'name': 'tempest-TestGettingAddress-server-520554038', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'fa1cd463d74b49139a088d332d37e611', 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'hostId': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.755 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.759 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6b54aedd-9d9e-436b-9010-c5b04ffaca40 / tap64e441f0-08 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.760 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '946b3ee5-1bed-40c2-8054-dfc165fc1a1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.752918', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c001ae6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': 'b8b54a3d30c08d651d189f4c18b5c33880ae91b33d09cb6d168357f1e623affb'}]}, 'timestamp': '2025-12-05 09:25:10.762448', '_unique_id': 'a381fa465c3f4482a279ff79463f8623'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.770 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.772 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.773 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.802 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.803 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0a3faaa-5f76-47ab-8fd6-c190f08468e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.772468', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c066b76-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': 'b39c58ba123a424f67e3675a21d5f8e7aa7f646f01f37bb2ebf5b7aa80572a8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.772468', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c068e8a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': 'e633b6eb976300372825f26dbc94454bf449a96e6fd1cd226c1919ff8b326edd'}]}, 'timestamp': '2025-12-05 09:25:10.804172', '_unique_id': '8d9909a00b744e9fb20ade7b83d10577'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.805 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.807 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.818 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.819 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '269201ba-aba5-4096-961c-36c80db49cad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.806973', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c08e2ca-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.865475791, 'message_signature': 'b91e150b7417396fd8fb3dd2b026b158353d5d7519f972a013176a2a20b53243'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.806973', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c08f13e-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.865475791, 'message_signature': '8ea9c703d32ff5a4094d2dd8976a4d60617e170d32e746ed1fb124004d1f2da3'}]}, 'timestamp': '2025-12-05 09:25:10.819830', '_unique_id': 'f2d6925f2f10449caf0cfbd36d0636ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.821 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.822 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.823 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.823 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.823 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76d13503-e195-425d-8e78-f09c8db7acf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.822358', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c098388-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': '144fecba009eb47736166534a3d8cb1615cfb36a4770e38a11c8867a52179cda'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.822358', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c098fc2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': 'dc7c9e05acf31c29c16b7e4e524234b83000b0ff215a7a3ca309fc8d5df92872'}]}, 'timestamp': '2025-12-05 09:25:10.823833', '_unique_id': '2efa5802a63d4b39a52ebed33e490bb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.825 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.826 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.826 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.826 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5bebbb9-2285-4df9-b2f7-00d2691afd43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.825636', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c0a00d8-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.865475791, 'message_signature': '223ac006598a3a80b3b6960fb7b2f046a30fb99489997cc966b42f7d6cc12812'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 
'6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.825636', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c0a0a1a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.865475791, 'message_signature': 'a3587b714ba4fc875711b0bf349739a93114af73b40558f3d6343c5a06a81f4e'}]}, 'timestamp': '2025-12-05 09:25:10.826958', '_unique_id': '114f85e88a0849438a22420475a7319d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.827 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.828 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.828 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-520554038>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-520554038>]
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.829 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.829 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.830 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.830 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c45f45a8-be51-4bad-9e94-5f895b8506b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.829319', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c0a8c56-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': '2b941ca8624299a19440c520d98578a67b6398e0b8e68b3a5a40ac1a5f18a0ba'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 
'6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.829319', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c0a952a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': 'e15dc963fe5028041e514451b17dd6b710480413d9e5ed213384dfa6a883e98f'}]}, 'timestamp': '2025-12-05 09:25:10.830516', '_unique_id': 'bf88690f178446aaa6fcd3b0bb9c7064'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.831 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.832 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.832 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.832 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.833 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11447c50-e527-4765-b35d-753ab692ed1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.832081', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c0af83a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': '4d03d0a16f3d8e46b6e970cbb7b3af05b9f37df56793893a49f58d9103b9fff7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.832081', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c0b024e-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': '936b03128b7e1f6614d211cb5d082424d4b01cdc38efe575d23f539c26a8ef01'}]}, 'timestamp': '2025-12-05 09:25:10.833337', '_unique_id': '6af23917ea4b4d499f5fd9e3a0c16ddf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.834 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.835 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-520554038>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-520554038>]
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.835 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.835 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.835 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '108238fd-216c-4e81-902d-dcdce1cf0f00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.835235', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c0b710c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': '8ce7d600840020c374034ee9bb549a89bc51a593abd100d571f10c111e9f557d'}]}, 'timestamp': '2025-12-05 09:25:10.836168', '_unique_id': 'd30d3321ca024e6f8eb9c107923a1325'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.836 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.837 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.838 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.853 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb3f099c-0292-48ad-86f3-c587d944f40a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'timestamp': '2025-12-05T09:25:10.837500', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4c0e393c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.911051427, 'message_signature': 'd0d8c893315cb143a35f02fdf563fb642bcd45cb43a3a80639cc6a848007cfb1'}]}, 'timestamp': '2025-12-05 09:25:10.854590', '_unique_id': '1728dc10c5ce453d8481f6532797477a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.857 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.857 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '917b67f7-bab3-4534-96de-bf2a477d4e68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.856838', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c0ecbcc-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': '9e9f835108a64ed6fc0d16c360e7f5c1eb2959ad753ad4d2a56229c6874b9291'}]}, 'timestamp': '2025-12-05 09:25:10.858153', '_unique_id': 'ec9e16659f0f4951b1d7f67c1a301bc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.860 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.861 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.861 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.861 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-520554038>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-520554038>]
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.861 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.862 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.862 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bd39476-8651-4724-a0c1-95fb3a4698f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.861672', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c0f8ada-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': '8a7ec5c3ef014bc7bf1b17fa8e086808fdbb2d28bbb4bf8dbb1367d9f253bcc0'}]}, 'timestamp': '2025-12-05 09:25:10.863106', '_unique_id': 'debd48f441cf4b008afb3af247cba369'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.865 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.865 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44071368-8241-48e2-993a-7b001298647e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.864374', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c0fe642-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': 'acbae4077908422c7bd7819bb2bc3bdb2863df3b34198faf2ab64f328e05addc'}]}, 'timestamp': '2025-12-05 09:25:10.865382', '_unique_id': 'de1de1efb6b44102811a32af6143d429'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.866 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.867 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.867 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.867 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd42a667-d0ad-44f6-81b4-8f05405936b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.866849', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c1049c0-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': '7d62b86ac861a4fd0065a2d1152ce41973989a82c147ece3799413775a01a785'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.866849', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c1053ac-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': '6d65f4e58772e32e85f1deaf39e589ed1d8b03fdd287f61f115ddd20e3715218'}]}, 'timestamp': '2025-12-05 09:25:10.868159', '_unique_id': 'dbcc69b7b4e04d368dc8b90a20ea4a44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.868 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.869 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.869 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.869 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-520554038>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-520554038>]
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.869 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.870 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.870 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.870 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd077ee5-2fb5-43f3-990f-d690cc2d4b06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.869686', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c10b31a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.865475791, 'message_signature': '4530bf3b3881b75ffee453d232b9e8b8ab3e1ee56eca527fa1004ebe7ffc60b1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 
'6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.869686', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c10bc52-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.865475791, 'message_signature': 'd06a01fdec90a5124cb2264927ad2fc05f458b8bc93dffa179157df64b31601b'}]}, 'timestamp': '2025-12-05 09:25:10.870846', '_unique_id': '520970e7e0a64746a4c2b066fa892fa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.871 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.872 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.872 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d10eea8-d0e2-4732-83a8-d2942fbaa676', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.872292', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c11180a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': '12b78939ad9e1f23eed830983b92768ff4c6e6590827e21898e20a78e75b986a'}]}, 'timestamp': '2025-12-05 09:25:10.873234', '_unique_id': '565eb7a29b2f41e0a85ee99d2fc655e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.874 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.875 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.875 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20f0cb41-3848-41a6-8c08-2aa1252ba55b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.874718', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c117af2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': '3a4a3aa1c554d5c960d57c3bd7185200e090870c95c7ba875bbc615f9a38bc88'}]}, 'timestamp': '2025-12-05 09:25:10.875741', '_unique_id': 'e857ae47c658477b8a90f2dcdb667ae9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.877 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.877 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.877 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbf37490-b2cb-499f-928e-dcd68eacce4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-vda', 'timestamp': '2025-12-05T09:25:10.876975', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4c11d2d6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': '10a389398a283203e63ddef8e410221bca4b2f8aff128f5f8cee676b6a3310d8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40-sda', 'timestamp': '2025-12-05T09:25:10.876975', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'instance-00000007', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4c11dc0e-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.831327045, 'message_signature': '49b64fa7bb892e320b17dddd0d65470592cc05016364908f108b8214d121300b'}]}, 'timestamp': '2025-12-05 09:25:10.878204', '_unique_id': '158a61caafbd4a388e06cfecf9577a86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.878 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.880 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.880 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.880 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 6b54aedd-9d9e-436b-9010-c5b04ffaca40: ceilometer.compute.pollsters.NoVolumeException
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.881 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.881 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '633102bb-1a92-4c14-b8f2-3266faeaf0b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.881246', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c12786c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': '54516842a0953ac39c84b353433b4c562cdd2f4a9bbd7a9333b5f26680a09872'}]}, 'timestamp': '2025-12-05 09:25:10.882223', '_unique_id': 'b1c29c944cea41d48f4068bcb3b1b0a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5eb95963-251e-4f03-a0ae-6e6c9137b672', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.883465', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c12cd26-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': 'b2c248e1d162ca69e5e9d44f80ffaff452409f59c4d32b0f8a7854badb977a3f'}]}, 'timestamp': '2025-12-05 09:25:10.884390', '_unique_id': 'f286fb2ee08d4aae89b4cf87dadf2d1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.885 12 DEBUG ceilometer.compute.pollsters [-] Instance 2b94d7d0-b000-460f-8883-8953d60115d0 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000003, id=2b94d7d0-b000-460f-8883-8953d60115d0>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 DEBUG ceilometer.compute.pollsters [-] 6b54aedd-9d9e-436b-9010-c5b04ffaca40/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5117fe4f-cc25-4c5a-8240-e376619209b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000007-6b54aedd-9d9e-436b-9010-c5b04ffaca40-tap64e441f0-08', 'timestamp': '2025-12-05T09:25:10.885495', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-520554038', 'name': 'tap64e441f0-08', 'instance_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:73:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap64e441f0-08'}, 'message_id': '4c131880-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 3983.813708633, 'message_signature': '3bb6a4b0e3c7319e9fb9a35bf37c2cbd1a5b536bbbd37542b948bf70a8442971'}]}, 'timestamp': '2025-12-05 09:25:10.886319', '_unique_id': '8cec6f3760b449eead5f7eaf2303eacd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:25:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:25:10.886 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.280 189070 DEBUG nova.compute.manager [req-a3031e24-94f2-46ab-847d-1c78364fa56f req-c206260e-ba28-4b2b-a26a-adc2efcacdaf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.281 189070 DEBUG oslo_concurrency.lockutils [req-a3031e24-94f2-46ab-847d-1c78364fa56f req-c206260e-ba28-4b2b-a26a-adc2efcacdaf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.281 189070 DEBUG oslo_concurrency.lockutils [req-a3031e24-94f2-46ab-847d-1c78364fa56f req-c206260e-ba28-4b2b-a26a-adc2efcacdaf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.282 189070 DEBUG oslo_concurrency.lockutils [req-a3031e24-94f2-46ab-847d-1c78364fa56f req-c206260e-ba28-4b2b-a26a-adc2efcacdaf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.282 189070 DEBUG nova.compute.manager [req-a3031e24-94f2-46ab-847d-1c78364fa56f req-c206260e-ba28-4b2b-a26a-adc2efcacdaf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Processing event network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.282 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.286 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926711.286478, 6b54aedd-9d9e-436b-9010-c5b04ffaca40 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.287 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] VM Resumed (Lifecycle Event)
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.288 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.292 189070 INFO nova.virt.libvirt.driver [-] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Instance spawned successfully.
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.292 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.373 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.374 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.374 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.375 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.375 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.376 189070 DEBUG nova.virt.libvirt.driver [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.388 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.392 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.453 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.552 189070 INFO nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Took 43.97 seconds to spawn the instance on the hypervisor.
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.553 189070 DEBUG nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:11 compute-1 podman[222351]: 2025-12-05 09:25:11.634942874 +0000 UTC m=+0.066726006 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.831 189070 INFO nova.compute.manager [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Took 48.61 seconds to build instance.
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.939 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:11 compute-1 nova_compute[189066]: 2025-12-05 09:25:11.951 189070 DEBUG oslo_concurrency.lockutils [None req-84589b53-7718-467d-b317-24ceebd62a0b fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 49.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:13 compute-1 nova_compute[189066]: 2025-12-05 09:25:13.563 189070 DEBUG nova.network.neutron [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 2b94d7d0-b000-460f-8883-8953d60115d0] Updating instance_info_cache with network_info: [{"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:13 compute-1 nova_compute[189066]: 2025-12-05 09:25:13.787 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.290 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Releasing lock "refresh_cache-2b94d7d0-b000-460f-8883-8953d60115d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.291 189070 DEBUG nova.objects.instance [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b94d7d0-b000-460f-8883-8953d60115d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.476 189070 DEBUG nova.virt.libvirt.host [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.476 189070 INFO nova.virt.libvirt.host [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] UEFI support detected
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.479 189070 DEBUG nova.virt.libvirt.vif [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:21:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-804921789',display_name='tempest-TestNetworkAdvancedServerOps-server-804921789',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-804921789',id=3,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDW3qadE13zsRSDKJlvOjZ/sgAUF4xLDU490CG+fZt4wcZlik42yE/XMAhTvQFBPRgAXCQSd0Lb+5EKERW2D+JVe4gr8wR12Ds02HfiWcpROJrTejCLEVPRm4reH/9sdrA==',key_name='tempest-TestNetworkAdvancedServerOps-175847342',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:24:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-rmofgs1d',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:24:53Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=2b94d7d0-b000-460f-8883-8953d60115d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.479 189070 DEBUG nova.network.os_vif_util [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "address": "fa:16:3e:57:1d:2e", "network": {"id": "359084ff-88c9-4324-bfef-b1ef9f403118", "bridge": "br-int", "label": "tempest-network-smoke--271958037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d6abb2-b3", "ovs_interfaceid": "b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.480 189070 DEBUG nova.network.os_vif_util [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.481 189070 DEBUG os_vif [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.482 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.483 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2d6abb2-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.483 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.486 189070 INFO os_vif [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:1d:2e,bridge_name='br-int',has_traffic_filtering=True,id=b2d6abb2-b32e-4bf7-bbdd-00d28c47ddc5,network=Network(359084ff-88c9-4324-bfef-b1ef9f403118),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d6abb2-b3')
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.486 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.486 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.598 189070 DEBUG nova.network.neutron [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Successfully created port: 9895d3af-515e-43f4-bc1f-97b82be6c710 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:25:14 compute-1 podman[222374]: 2025-12-05 09:25:14.633680613 +0000 UTC m=+0.067089769 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:25:14 compute-1 nova_compute[189066]: 2025-12-05 09:25:14.841 189070 DEBUG nova.compute.provider_tree [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.086 189070 DEBUG nova.scheduler.client.report [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.390 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.623 189070 INFO nova.scheduler.client.report [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Deleted allocation for migration 28d48d37-57dd-4441-8e4e-cf3cfac9be6c
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.756 189070 DEBUG oslo_concurrency.lockutils [None req-9012beb5-e7ac-4468-9dd4-0f0122c05614 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "2b94d7d0-b000-460f-8883-8953d60115d0" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 10.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.909 189070 DEBUG nova.compute.manager [req-cccea979-46bd-4fff-8fea-80fea322db88 req-5b65acaf-45e6-48c9-b99c-3d66ad9df9fa 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.910 189070 DEBUG oslo_concurrency.lockutils [req-cccea979-46bd-4fff-8fea-80fea322db88 req-5b65acaf-45e6-48c9-b99c-3d66ad9df9fa 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.910 189070 DEBUG oslo_concurrency.lockutils [req-cccea979-46bd-4fff-8fea-80fea322db88 req-5b65acaf-45e6-48c9-b99c-3d66ad9df9fa 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.910 189070 DEBUG oslo_concurrency.lockutils [req-cccea979-46bd-4fff-8fea-80fea322db88 req-5b65acaf-45e6-48c9-b99c-3d66ad9df9fa 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.911 189070 DEBUG nova.compute.manager [req-cccea979-46bd-4fff-8fea-80fea322db88 req-5b65acaf-45e6-48c9-b99c-3d66ad9df9fa 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] No waiting events found dispatching network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:25:15 compute-1 nova_compute[189066]: 2025-12-05 09:25:15.911 189070 WARNING nova.compute.manager [req-cccea979-46bd-4fff-8fea-80fea322db88 req-5b65acaf-45e6-48c9-b99c-3d66ad9df9fa 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received unexpected event network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 for instance with vm_state active and task_state None.
Dec 05 09:25:16 compute-1 nova_compute[189066]: 2025-12-05 09:25:16.943 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:16 compute-1 sshd-session[222396]: Received disconnect from 122.168.194.41 port 33394:11: Bye Bye [preauth]
Dec 05 09:25:16 compute-1 sshd-session[222396]: Disconnected from authenticating user root 122.168.194.41 port 33394 [preauth]
Dec 05 09:25:17 compute-1 nova_compute[189066]: 2025-12-05 09:25:17.588 189070 DEBUG nova.network.neutron [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Successfully updated port: 9895d3af-515e-43f4-bc1f-97b82be6c710 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:25:17 compute-1 nova_compute[189066]: 2025-12-05 09:25:17.745 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:25:17 compute-1 nova_compute[189066]: 2025-12-05 09:25:17.746 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquired lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:25:17 compute-1 nova_compute[189066]: 2025-12-05 09:25:17.746 189070 DEBUG nova.network.neutron [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:25:18 compute-1 nova_compute[189066]: 2025-12-05 09:25:18.301 189070 DEBUG nova.network.neutron [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:25:18 compute-1 nova_compute[189066]: 2025-12-05 09:25:18.789 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:19 compute-1 nova_compute[189066]: 2025-12-05 09:25:19.598 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.522 189070 DEBUG nova.compute.manager [req-ba127b3b-54c4-4fc7-bbbf-f81f56364169 req-e029139d-e231-40f3-8337-54fa4f3ad45a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-changed-9895d3af-515e-43f4-bc1f-97b82be6c710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.523 189070 DEBUG nova.compute.manager [req-ba127b3b-54c4-4fc7-bbbf-f81f56364169 req-e029139d-e231-40f3-8337-54fa4f3ad45a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Refreshing instance network info cache due to event network-changed-9895d3af-515e-43f4-bc1f-97b82be6c710. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.523 189070 DEBUG oslo_concurrency.lockutils [req-ba127b3b-54c4-4fc7-bbbf-f81f56364169 req-e029139d-e231-40f3-8337-54fa4f3ad45a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.791 189070 DEBUG nova.network.neutron [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updating instance_info_cache with network_info: [{"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.840 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Releasing lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.841 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Instance network_info: |[{"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.841 189070 DEBUG oslo_concurrency.lockutils [req-ba127b3b-54c4-4fc7-bbbf-f81f56364169 req-e029139d-e231-40f3-8337-54fa4f3ad45a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.842 189070 DEBUG nova.network.neutron [req-ba127b3b-54c4-4fc7-bbbf-f81f56364169 req-e029139d-e231-40f3-8337-54fa4f3ad45a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Refreshing network info cache for port 9895d3af-515e-43f4-bc1f-97b82be6c710 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.845 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Start _get_guest_xml network_info=[{"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.850 189070 WARNING nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.859 189070 DEBUG nova.virt.libvirt.host [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.860 189070 DEBUG nova.virt.libvirt.host [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.868 189070 DEBUG nova.virt.libvirt.host [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.869 189070 DEBUG nova.virt.libvirt.host [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.870 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.870 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.871 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.871 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.871 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.872 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.872 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.872 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.873 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.873 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.873 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.874 189070 DEBUG nova.virt.hardware [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.879 189070 DEBUG nova.virt.libvirt.vif [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1546276319-access_point-1221653276',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1546276319-access_point-1221653276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1546276319-ac',id=9,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFJkw+l11RVGYygdYFY+5dTvjVW6UnWERneAkkJsvcy3vaYGsZxqSJKH1T1OhIDkYKsREB1fnusNh4+5qwwXs+krdjcuQPiAa96VT7yue5/M3ONTUDBINBXBPmqK+uIgKg==',key_name='tempest-TestSecurityGroupsBasicOps-926259432',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efceb9f71c3c447ea59c2d2694f9e636',ramdisk_id='',reservation_id='r-asrkgshf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1546276319',owner_user_name='tempest-TestSecurityGroupsBasicOps-1546276319-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:25:08Z,user_data=None,user_id='3cf2b75d6732438a9a3626bc5db6d76e',uuid=b4e74925-5201-4f70-9beb-258ed9dc025a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.879 189070 DEBUG nova.network.os_vif_util [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Converting VIF {"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.880 189070 DEBUG nova.network.os_vif_util [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:c3:47,bridge_name='br-int',has_traffic_filtering=True,id=9895d3af-515e-43f4-bc1f-97b82be6c710,network=Network(6ce38059-35e2-48bf-bd81-40e486d57627),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9895d3af-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.881 189070 DEBUG nova.objects.instance [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lazy-loading 'pci_devices' on Instance uuid b4e74925-5201-4f70-9beb-258ed9dc025a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.906 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <uuid>b4e74925-5201-4f70-9beb-258ed9dc025a</uuid>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <name>instance-00000009</name>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1546276319-access_point-1221653276</nova:name>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:25:20</nova:creationTime>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:25:20 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:25:20 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:25:20 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:25:20 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:25:20 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:25:20 compute-1 nova_compute[189066]:         <nova:user uuid="3cf2b75d6732438a9a3626bc5db6d76e">tempest-TestSecurityGroupsBasicOps-1546276319-project-member</nova:user>
Dec 05 09:25:20 compute-1 nova_compute[189066]:         <nova:project uuid="efceb9f71c3c447ea59c2d2694f9e636">tempest-TestSecurityGroupsBasicOps-1546276319</nova:project>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:25:20 compute-1 nova_compute[189066]:         <nova:port uuid="9895d3af-515e-43f4-bc1f-97b82be6c710">
Dec 05 09:25:20 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <system>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <entry name="serial">b4e74925-5201-4f70-9beb-258ed9dc025a</entry>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <entry name="uuid">b4e74925-5201-4f70-9beb-258ed9dc025a</entry>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </system>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <os>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   </os>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <features>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   </features>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk.config"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:64:c3:47"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <target dev="tap9895d3af-51"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/console.log" append="off"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <video>
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </video>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:25:20 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:25:20 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:25:20 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:25:20 compute-1 nova_compute[189066]: </domain>
Dec 05 09:25:20 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.907 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Preparing to wait for external event network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.908 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.908 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.908 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.909 189070 DEBUG nova.virt.libvirt.vif [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1546276319-access_point-1221653276',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1546276319-access_point-1221653276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1546276319-ac',id=9,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFJkw+l11RVGYygdYFY+5dTvjVW6UnWERneAkkJsvcy3vaYGsZxqSJKH1T1OhIDkYKsREB1fnusNh4+5qwwXs+krdjcuQPiAa96VT7yue5/M3ONTUDBINBXBPmqK+uIgKg==',key_name='tempest-TestSecurityGroupsBasicOps-926259432',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efceb9f71c3c447ea59c2d2694f9e636',ramdisk_id='',reservation_id='r-asrkgshf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1546276319',owner_user_name='tempest-TestSecurityGroupsBasicOps-1546276319-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:25:08Z,user_data=None,user_id='3cf2b75d6732438a9a3626bc5db6d76e',uuid=b4e74925-5201-4f70-9beb-258ed9dc025a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.909 189070 DEBUG nova.network.os_vif_util [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Converting VIF {"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.909 189070 DEBUG nova.network.os_vif_util [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:c3:47,bridge_name='br-int',has_traffic_filtering=True,id=9895d3af-515e-43f4-bc1f-97b82be6c710,network=Network(6ce38059-35e2-48bf-bd81-40e486d57627),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9895d3af-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.910 189070 DEBUG os_vif [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:c3:47,bridge_name='br-int',has_traffic_filtering=True,id=9895d3af-515e-43f4-bc1f-97b82be6c710,network=Network(6ce38059-35e2-48bf-bd81-40e486d57627),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9895d3af-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.910 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.911 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.911 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.914 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.914 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9895d3af-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.915 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9895d3af-51, col_values=(('external_ids', {'iface-id': '9895d3af-515e-43f4-bc1f-97b82be6c710', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:c3:47', 'vm-uuid': 'b4e74925-5201-4f70-9beb-258ed9dc025a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.916 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:20 compute-1 NetworkManager[55704]: <info>  [1764926720.9177] manager: (tap9895d3af-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.919 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.927 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:20 compute-1 nova_compute[189066]: 2025-12-05 09:25:20.929 189070 INFO os_vif [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:c3:47,bridge_name='br-int',has_traffic_filtering=True,id=9895d3af-515e-43f4-bc1f-97b82be6c710,network=Network(6ce38059-35e2-48bf-bd81-40e486d57627),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9895d3af-51')
Dec 05 09:25:21 compute-1 nova_compute[189066]: 2025-12-05 09:25:21.005 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:25:21 compute-1 nova_compute[189066]: 2025-12-05 09:25:21.006 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:25:21 compute-1 nova_compute[189066]: 2025-12-05 09:25:21.006 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] No VIF found with MAC fa:16:3e:64:c3:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:25:21 compute-1 nova_compute[189066]: 2025-12-05 09:25:21.007 189070 INFO nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Using config drive
Dec 05 09:25:21 compute-1 podman[222401]: 2025-12-05 09:25:21.041379583 +0000 UTC m=+0.061984273 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.067 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.360 189070 INFO nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Creating config drive at /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk.config
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.367 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1qrc7pxi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.499 189070 DEBUG oslo_concurrency.processutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1qrc7pxi" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:22 compute-1 kernel: tap9895d3af-51: entered promiscuous mode
Dec 05 09:25:22 compute-1 NetworkManager[55704]: <info>  [1764926722.5691] manager: (tap9895d3af-51): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec 05 09:25:22 compute-1 systemd-udevd[222440]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:25:22 compute-1 ovn_controller[95809]: 2025-12-05T09:25:22Z|00056|binding|INFO|Claiming lport 9895d3af-515e-43f4-bc1f-97b82be6c710 for this chassis.
Dec 05 09:25:22 compute-1 ovn_controller[95809]: 2025-12-05T09:25:22Z|00057|binding|INFO|9895d3af-515e-43f4-bc1f-97b82be6c710: Claiming fa:16:3e:64:c3:47 10.100.0.14
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.609 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:22 compute-1 NetworkManager[55704]: <info>  [1764926722.6277] device (tap9895d3af-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:25:22 compute-1 NetworkManager[55704]: <info>  [1764926722.6288] device (tap9895d3af-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:25:22 compute-1 ovn_controller[95809]: 2025-12-05T09:25:22Z|00058|binding|INFO|Setting lport 9895d3af-515e-43f4-bc1f-97b82be6c710 ovn-installed in OVS
Dec 05 09:25:22 compute-1 ovn_controller[95809]: 2025-12-05T09:25:22Z|00059|binding|INFO|Setting lport 9895d3af-515e-43f4-bc1f-97b82be6c710 up in Southbound
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.630 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.624 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:c3:47 10.100.0.14'], port_security=['fa:16:3e:64:c3:47 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b4e74925-5201-4f70-9beb-258ed9dc025a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ce38059-35e2-48bf-bd81-40e486d57627', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efceb9f71c3c447ea59c2d2694f9e636', 'neutron:revision_number': '2', 'neutron:security_group_ids': '005d7801-6db4-4517-9d42-7f13da167a01 7f744df6-6fc6-436b-b5e3-475437dac336', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2000da9-080b-4941-bfdd-da4222442ad0, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=9895d3af-515e-43f4-bc1f-97b82be6c710) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.628 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 9895d3af-515e-43f4-bc1f-97b82be6c710 in datapath 6ce38059-35e2-48bf-bd81-40e486d57627 bound to our chassis
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.631 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ce38059-35e2-48bf-bd81-40e486d57627
Dec 05 09:25:22 compute-1 systemd-machined[154815]: New machine qemu-4-instance-00000009.
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.648 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ca96cede-33ef-4d6f-8891-67a98d5cd82b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.651 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6ce38059-31 in ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.653 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6ce38059-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.654 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d7fa18-6092-41d7-8999-67434a0febd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.655 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d1efcc9b-10fa-4bbe-8ccc-fc59dde26e9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.671 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[406b642a-f72a-4c87-a548-9d1798143110]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.696 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[cea757aa-9eb0-45b5-8fa6-8cded8715a7b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.734 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[06a8835e-1687-4a02-991e-09af4416593d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 NetworkManager[55704]: <info>  [1764926722.7452] manager: (tap6ce38059-30): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.748 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[25f1b57e-ea23-4cd1-bb19-73d704872ad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.790 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[b735a38b-2fda-4ff4-89a6-36a0a76f00d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.794 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[09460fa1-41aa-4f95-b067-b3479491bc2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 NetworkManager[55704]: <info>  [1764926722.8235] device (tap6ce38059-30): carrier: link connected
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.823 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[a501cbd6-83d1-425e-babc-2c14722d6920]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.844 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed3b7fb-f47c-4a90-af2d-bc96e4b22c8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ce38059-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:af:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399582, 'reachable_time': 37696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222476, 'error': None, 'target': 'ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.864 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6d03f14b-5088-40b4-8b2d-9c75193b3ea8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe32:af11'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399582, 'tstamp': 399582}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222477, 'error': None, 'target': 'ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.882 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f5323a18-b5d7-4c81-9a1e-934c1b494e4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ce38059-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:af:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399582, 'reachable_time': 37696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222478, 'error': None, 'target': 'ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:22.924 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e4edcb2a-a373-43fe-a4b3-5936ad070b25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.988 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.989 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.989 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:25:22 compute-1 nova_compute[189066]: 2025-12-05 09:25:22.989 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6b54aedd-9d9e-436b-9010-c5b04ffaca40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.007 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[00282cf1-5805-4c68-b57e-fe363999b76a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.009 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ce38059-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.010 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.010 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ce38059-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.013 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:23 compute-1 NetworkManager[55704]: <info>  [1764926723.0140] manager: (tap6ce38059-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec 05 09:25:23 compute-1 kernel: tap6ce38059-30: entered promiscuous mode
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.018 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.019 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ce38059-30, col_values=(('external_ids', {'iface-id': '76b453b1-696b-4d6e-a37b-38d94febf25b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.021 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:23 compute-1 ovn_controller[95809]: 2025-12-05T09:25:23Z|00060|binding|INFO|Releasing lport 76b453b1-696b-4d6e-a37b-38d94febf25b from this chassis (sb_readonly=0)
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.022 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.023 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6ce38059-35e2-48bf-bd81-40e486d57627.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6ce38059-35e2-48bf-bd81-40e486d57627.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.024 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[df0ede08-5969-4123-8dcf-b62b9c1ad2c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.025 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-6ce38059-35e2-48bf-bd81-40e486d57627
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/6ce38059-35e2-48bf-bd81-40e486d57627.pid.haproxy
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 6ce38059-35e2-48bf-bd81-40e486d57627
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:25:23 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:23.026 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627', 'env', 'PROCESS_TAG=haproxy-6ce38059-35e2-48bf-bd81-40e486d57627', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6ce38059-35e2-48bf-bd81-40e486d57627.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.033 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.053 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926723.0525944, b4e74925-5201-4f70-9beb-258ed9dc025a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.053 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] VM Started (Lifecycle Event)
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.086 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.090 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926723.0536017, b4e74925-5201-4f70-9beb-258ed9dc025a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.091 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] VM Paused (Lifecycle Event)
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.119 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.124 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.181 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:25:23 compute-1 podman[222518]: 2025-12-05 09:25:23.506607495 +0000 UTC m=+0.073978358 container create 59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:25:23 compute-1 systemd[1]: Started libpod-conmon-59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2.scope.
Dec 05 09:25:23 compute-1 podman[222518]: 2025-12-05 09:25:23.476810679 +0000 UTC m=+0.044181542 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:25:23 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:25:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adfcae394d927dbf643aea13343d5616fa9d94a201d320693c9ee93d7dc0abfb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:25:23 compute-1 podman[222518]: 2025-12-05 09:25:23.643888966 +0000 UTC m=+0.211259849 container init 59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 09:25:23 compute-1 podman[222518]: 2025-12-05 09:25:23.650250314 +0000 UTC m=+0.217621177 container start 59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 09:25:23 compute-1 neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627[222536]: [NOTICE]   (222545) : New worker (222547) forked
Dec 05 09:25:23 compute-1 neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627[222536]: [NOTICE]   (222545) : Loading success.
Dec 05 09:25:23 compute-1 nova_compute[189066]: 2025-12-05 09:25:23.794 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:24 compute-1 ovn_controller[95809]: 2025-12-05T09:25:24Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:73:50 10.100.0.3
Dec 05 09:25:24 compute-1 ovn_controller[95809]: 2025-12-05T09:25:24Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:73:50 10.100.0.3
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.621 189070 DEBUG nova.compute.manager [req-831bb91b-6b3c-4a6b-92f0-cf1261a568ba req-b3ab22a2-7140-4065-a38c-df4cf95f01a7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.622 189070 DEBUG oslo_concurrency.lockutils [req-831bb91b-6b3c-4a6b-92f0-cf1261a568ba req-b3ab22a2-7140-4065-a38c-df4cf95f01a7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.622 189070 DEBUG oslo_concurrency.lockutils [req-831bb91b-6b3c-4a6b-92f0-cf1261a568ba req-b3ab22a2-7140-4065-a38c-df4cf95f01a7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.622 189070 DEBUG oslo_concurrency.lockutils [req-831bb91b-6b3c-4a6b-92f0-cf1261a568ba req-b3ab22a2-7140-4065-a38c-df4cf95f01a7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.622 189070 DEBUG nova.compute.manager [req-831bb91b-6b3c-4a6b-92f0-cf1261a568ba req-b3ab22a2-7140-4065-a38c-df4cf95f01a7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Processing event network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.623 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.630 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926725.6291535, b4e74925-5201-4f70-9beb-258ed9dc025a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.631 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] VM Resumed (Lifecycle Event)
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.635 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.641 189070 INFO nova.virt.libvirt.driver [-] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Instance spawned successfully.
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.642 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.856 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.862 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.862 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.863 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.864 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.864 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.865 189070 DEBUG nova.virt.libvirt.driver [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.870 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:25:25 compute-1 nova_compute[189066]: 2025-12-05 09:25:25.960 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:26 compute-1 nova_compute[189066]: 2025-12-05 09:25:26.110 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:25:26 compute-1 nova_compute[189066]: 2025-12-05 09:25:26.186 189070 INFO nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Took 17.78 seconds to spawn the instance on the hypervisor.
Dec 05 09:25:26 compute-1 nova_compute[189066]: 2025-12-05 09:25:26.187 189070 DEBUG nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:26 compute-1 nova_compute[189066]: 2025-12-05 09:25:26.317 189070 INFO nova.compute.manager [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Took 18.99 seconds to build instance.
Dec 05 09:25:26 compute-1 nova_compute[189066]: 2025-12-05 09:25:26.350 189070 DEBUG oslo_concurrency.lockutils [None req-c79702cb-8a4b-4665-b9ca-3624a8692fbd 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:26 compute-1 nova_compute[189066]: 2025-12-05 09:25:26.775 189070 DEBUG nova.network.neutron [req-ba127b3b-54c4-4fc7-bbbf-f81f56364169 req-e029139d-e231-40f3-8337-54fa4f3ad45a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updated VIF entry in instance network info cache for port 9895d3af-515e-43f4-bc1f-97b82be6c710. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:25:26 compute-1 nova_compute[189066]: 2025-12-05 09:25:26.776 189070 DEBUG nova.network.neutron [req-ba127b3b-54c4-4fc7-bbbf-f81f56364169 req-e029139d-e231-40f3-8337-54fa4f3ad45a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updating instance_info_cache with network_info: [{"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:26 compute-1 nova_compute[189066]: 2025-12-05 09:25:26.815 189070 DEBUG oslo_concurrency.lockutils [req-ba127b3b-54c4-4fc7-bbbf-f81f56364169 req-e029139d-e231-40f3-8337-54fa4f3ad45a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.117 189070 DEBUG nova.compute.manager [req-164d3b0e-16a4-4067-aa26-ad4a57bdb4d7 req-1375d3ca-7563-49ae-b79f-6c595d2b948f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.118 189070 DEBUG oslo_concurrency.lockutils [req-164d3b0e-16a4-4067-aa26-ad4a57bdb4d7 req-1375d3ca-7563-49ae-b79f-6c595d2b948f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.118 189070 DEBUG oslo_concurrency.lockutils [req-164d3b0e-16a4-4067-aa26-ad4a57bdb4d7 req-1375d3ca-7563-49ae-b79f-6c595d2b948f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.119 189070 DEBUG oslo_concurrency.lockutils [req-164d3b0e-16a4-4067-aa26-ad4a57bdb4d7 req-1375d3ca-7563-49ae-b79f-6c595d2b948f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.119 189070 DEBUG nova.compute.manager [req-164d3b0e-16a4-4067-aa26-ad4a57bdb4d7 req-1375d3ca-7563-49ae-b79f-6c595d2b948f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] No waiting events found dispatching network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.119 189070 WARNING nova.compute.manager [req-164d3b0e-16a4-4067-aa26-ad4a57bdb4d7 req-1375d3ca-7563-49ae-b79f-6c595d2b948f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received unexpected event network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 for instance with vm_state active and task_state None.
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.725 189070 DEBUG nova.compute.manager [req-52c91799-7d50-496e-b766-45ccaa35eb73 req-4e84eb55-9b4b-455c-97d2-be80d385797d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-changed-64e441f0-08b6-483c-9732-5217cdbe1468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.726 189070 DEBUG nova.compute.manager [req-52c91799-7d50-496e-b766-45ccaa35eb73 req-4e84eb55-9b4b-455c-97d2-be80d385797d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Refreshing instance network info cache due to event network-changed-64e441f0-08b6-483c-9732-5217cdbe1468. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.726 189070 DEBUG oslo_concurrency.lockutils [req-52c91799-7d50-496e-b766-45ccaa35eb73 req-4e84eb55-9b4b-455c-97d2-be80d385797d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:25:28 compute-1 nova_compute[189066]: 2025-12-05 09:25:28.795 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:29 compute-1 podman[222560]: 2025-12-05 09:25:29.661448337 +0000 UTC m=+0.070349268 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Dec 05 09:25:30 compute-1 nova_compute[189066]: 2025-12-05 09:25:30.963 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:30 compute-1 nova_compute[189066]: 2025-12-05 09:25:30.973 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updating instance_info_cache with network_info: [{"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.008 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.008 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.009 189070 DEBUG oslo_concurrency.lockutils [req-52c91799-7d50-496e-b766-45ccaa35eb73 req-4e84eb55-9b4b-455c-97d2-be80d385797d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.009 189070 DEBUG nova.network.neutron [req-52c91799-7d50-496e-b766-45ccaa35eb73 req-4e84eb55-9b4b-455c-97d2-be80d385797d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Refreshing network info cache for port 64e441f0-08b6-483c-9732-5217cdbe1468 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.010 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.011 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.012 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.012 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.013 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.013 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.013 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.013 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.050 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.051 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.052 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.052 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.174 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.239 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.240 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.304 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.311 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.377 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.378 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.443 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.644 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.646 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5448MB free_disk=73.30412292480469GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.647 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:31 compute-1 nova_compute[189066]: 2025-12-05 09:25:31.647 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:32 compute-1 nova_compute[189066]: 2025-12-05 09:25:32.654 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 6b54aedd-9d9e-436b-9010-c5b04ffaca40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:25:32 compute-1 nova_compute[189066]: 2025-12-05 09:25:32.656 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance b4e74925-5201-4f70-9beb-258ed9dc025a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:25:32 compute-1 nova_compute[189066]: 2025-12-05 09:25:32.657 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:25:32 compute-1 nova_compute[189066]: 2025-12-05 09:25:32.657 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:25:32 compute-1 podman[222593]: 2025-12-05 09:25:32.673766176 +0000 UTC m=+0.105082467 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 09:25:32 compute-1 nova_compute[189066]: 2025-12-05 09:25:32.777 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:25:32 compute-1 nova_compute[189066]: 2025-12-05 09:25:32.806 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:25:32 compute-1 nova_compute[189066]: 2025-12-05 09:25:32.849 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:25:32 compute-1 nova_compute[189066]: 2025-12-05 09:25:32.850 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:33 compute-1 nova_compute[189066]: 2025-12-05 09:25:33.800 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:34 compute-1 nova_compute[189066]: 2025-12-05 09:25:34.624 189070 DEBUG nova.network.neutron [req-52c91799-7d50-496e-b766-45ccaa35eb73 req-4e84eb55-9b4b-455c-97d2-be80d385797d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updated VIF entry in instance network info cache for port 64e441f0-08b6-483c-9732-5217cdbe1468. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:25:34 compute-1 nova_compute[189066]: 2025-12-05 09:25:34.625 189070 DEBUG nova.network.neutron [req-52c91799-7d50-496e-b766-45ccaa35eb73 req-4e84eb55-9b4b-455c-97d2-be80d385797d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updating instance_info_cache with network_info: [{"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:34 compute-1 nova_compute[189066]: 2025-12-05 09:25:34.646 189070 DEBUG oslo_concurrency.lockutils [req-52c91799-7d50-496e-b766-45ccaa35eb73 req-4e84eb55-9b4b-455c-97d2-be80d385797d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:35 compute-1 podman[222622]: 2025-12-05 09:25:35.623622941 +0000 UTC m=+0.063389358 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 05 09:25:35 compute-1 nova_compute[189066]: 2025-12-05 09:25:35.960 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:35 compute-1 nova_compute[189066]: 2025-12-05 09:25:35.962 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:35 compute-1 nova_compute[189066]: 2025-12-05 09:25:35.963 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:35 compute-1 nova_compute[189066]: 2025-12-05 09:25:35.963 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:35 compute-1 nova_compute[189066]: 2025-12-05 09:25:35.964 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:35 compute-1 nova_compute[189066]: 2025-12-05 09:25:35.965 189070 INFO nova.compute.manager [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Terminating instance
Dec 05 09:25:35 compute-1 nova_compute[189066]: 2025-12-05 09:25:35.966 189070 DEBUG nova.compute.manager [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:25:35 compute-1 ovn_controller[95809]: 2025-12-05T09:25:35Z|00061|binding|INFO|Releasing lport fdbc0f28-ff71-4c6c-87fd-d55723f69ac2 from this chassis (sb_readonly=0)
Dec 05 09:25:35 compute-1 ovn_controller[95809]: 2025-12-05T09:25:35Z|00062|binding|INFO|Releasing lport 76b453b1-696b-4d6e-a37b-38d94febf25b from this chassis (sb_readonly=0)
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.006 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 kernel: tap64e441f0-08 (unregistering): left promiscuous mode
Dec 05 09:25:36 compute-1 NetworkManager[55704]: <info>  [1764926736.0657] device (tap64e441f0-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:25:36 compute-1 ovn_controller[95809]: 2025-12-05T09:25:36Z|00063|binding|INFO|Releasing lport 64e441f0-08b6-483c-9732-5217cdbe1468 from this chassis (sb_readonly=0)
Dec 05 09:25:36 compute-1 ovn_controller[95809]: 2025-12-05T09:25:36Z|00064|binding|INFO|Setting lport 64e441f0-08b6-483c-9732-5217cdbe1468 down in Southbound
Dec 05 09:25:36 compute-1 ovn_controller[95809]: 2025-12-05T09:25:36Z|00065|binding|INFO|Removing iface tap64e441f0-08 ovn-installed in OVS
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.091 189070 DEBUG nova.compute.manager [req-0ae23b39-ae7d-42cc-89c2-5f6e3ccd069a req-cd67024a-a270-4a67-a372-f252352f5569 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-changed-64e441f0-08b6-483c-9732-5217cdbe1468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.091 189070 DEBUG nova.compute.manager [req-0ae23b39-ae7d-42cc-89c2-5f6e3ccd069a req-cd67024a-a270-4a67-a372-f252352f5569 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Refreshing instance network info cache due to event network-changed-64e441f0-08b6-483c-9732-5217cdbe1468. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.091 189070 DEBUG oslo_concurrency.lockutils [req-0ae23b39-ae7d-42cc-89c2-5f6e3ccd069a req-cd67024a-a270-4a67-a372-f252352f5569 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.091 189070 DEBUG oslo_concurrency.lockutils [req-0ae23b39-ae7d-42cc-89c2-5f6e3ccd069a req-cd67024a-a270-4a67-a372-f252352f5569 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.092 189070 DEBUG nova.network.neutron [req-0ae23b39-ae7d-42cc-89c2-5f6e3ccd069a req-cd67024a-a270-4a67-a372-f252352f5569 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Refreshing network info cache for port 64e441f0-08b6-483c-9732-5217cdbe1468 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.099 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.103 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.107 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:73:50 10.100.0.3 2001:db8::f816:3eff:fe5f:7350'], port_security=['fa:16:3e:5f:73:50 10.100.0.3 2001:db8::f816:3eff:fe5f:7350'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe5f:7350/64', 'neutron:device_id': '6b54aedd-9d9e-436b-9010-c5b04ffaca40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '4', 'neutron:security_group_ids': '94445a7f-7152-4017-ac4c-5834cf45389c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a03fccf0-8b31-495a-b68a-70be5d3c0194, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=64e441f0-08b6-483c-9732-5217cdbe1468) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.109 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 64e441f0-08b6-483c-9732-5217cdbe1468 in datapath f58cc02f-396f-494d-8f1e-d6f4412689c2 unbound from our chassis
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.112 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f58cc02f-396f-494d-8f1e-d6f4412689c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.114 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.114 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[133df173-bf29-4ca5-940d-b57d2992408a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.115 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2 namespace which is not needed anymore
Dec 05 09:25:36 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 05 09:25:36 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 15.140s CPU time.
Dec 05 09:25:36 compute-1 systemd-machined[154815]: Machine qemu-3-instance-00000007 terminated.
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.191 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.199 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.235 189070 INFO nova.virt.libvirt.driver [-] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Instance destroyed successfully.
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.236 189070 DEBUG nova.objects.instance [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'resources' on Instance uuid 6b54aedd-9d9e-436b-9010-c5b04ffaca40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.270 189070 DEBUG nova.virt.libvirt.vif [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:24:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-520554038',display_name='tempest-TestGettingAddress-server-520554038',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-520554038',id=7,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCHemSCbldDOeAvAklAY1GBE7IpwWfP2ZPUnyVv6eMnqJsn+zpySVsIxCVwt5x3P5v8sd6NBMOWL+BJqkJAyHVGLR2hUEXOHFYLsF9I/mokK0nJIvCbRk6qzwE1U+KhNRw==',key_name='tempest-TestGettingAddress-981102965',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:25:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-9vtmxokr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:25:11Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=6b54aedd-9d9e-436b-9010-c5b04ffaca40,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.271 189070 DEBUG nova.network.os_vif_util [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.272 189070 DEBUG nova.network.os_vif_util [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:73:50,bridge_name='br-int',has_traffic_filtering=True,id=64e441f0-08b6-483c-9732-5217cdbe1468,network=Network(f58cc02f-396f-494d-8f1e-d6f4412689c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64e441f0-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.272 189070 DEBUG os_vif [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:73:50,bridge_name='br-int',has_traffic_filtering=True,id=64e441f0-08b6-483c-9732-5217cdbe1468,network=Network(f58cc02f-396f-494d-8f1e-d6f4412689c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64e441f0-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:25:36 compute-1 neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2[222294]: [NOTICE]   (222298) : haproxy version is 2.8.14-c23fe91
Dec 05 09:25:36 compute-1 neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2[222294]: [NOTICE]   (222298) : path to executable is /usr/sbin/haproxy
Dec 05 09:25:36 compute-1 neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2[222294]: [WARNING]  (222298) : Exiting Master process...
Dec 05 09:25:36 compute-1 neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2[222294]: [ALERT]    (222298) : Current worker (222300) exited with code 143 (Terminated)
Dec 05 09:25:36 compute-1 neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2[222294]: [WARNING]  (222298) : All workers exited. Exiting... (0)
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.278 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 systemd[1]: libpod-3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87.scope: Deactivated successfully.
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.279 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap64e441f0-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.281 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.282 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 podman[222676]: 2025-12-05 09:25:36.285323418 +0000 UTC m=+0.059747467 container died 3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.286 189070 INFO os_vif [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:73:50,bridge_name='br-int',has_traffic_filtering=True,id=64e441f0-08b6-483c-9732-5217cdbe1468,network=Network(f58cc02f-396f-494d-8f1e-d6f4412689c2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64e441f0-08')
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.287 189070 INFO nova.virt.libvirt.driver [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Deleting instance files /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40_del
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.288 189070 INFO nova.virt.libvirt.driver [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Deletion of /var/lib/nova/instances/6b54aedd-9d9e-436b-9010-c5b04ffaca40_del complete
Dec 05 09:25:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87-userdata-shm.mount: Deactivated successfully.
Dec 05 09:25:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-318d7a1752c837b5ccc0860fe58bdf6906d58916a38f4de6c081886ae9c90363-merged.mount: Deactivated successfully.
Dec 05 09:25:36 compute-1 podman[222676]: 2025-12-05 09:25:36.32588331 +0000 UTC m=+0.100307339 container cleanup 3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:25:36 compute-1 systemd[1]: libpod-conmon-3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87.scope: Deactivated successfully.
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.366 189070 INFO nova.compute.manager [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.367 189070 DEBUG oslo.service.loopingcall [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.367 189070 DEBUG nova.compute.manager [-] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.367 189070 DEBUG nova.network.neutron [-] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:25:36 compute-1 podman[222712]: 2025-12-05 09:25:36.394145756 +0000 UTC m=+0.046287864 container remove 3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.402 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[591195be-3d97-4b26-bc4b-5d995766e879]: (4, ('Fri Dec  5 09:25:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2 (3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87)\n3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87\nFri Dec  5 09:25:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2 (3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87)\n3b9c09fd3b4004552bca99c2c8284c30eec648aa0ec387345cc2c3eebabe0d87\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.404 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[014fa42d-a98f-4182-a438-2bfbabec89d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.405 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf58cc02f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.408 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 kernel: tapf58cc02f-30: left promiscuous mode
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.411 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.414 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b7beac6e-a049-40e1-9b68-c1c4a8ce0f1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:36 compute-1 nova_compute[189066]: 2025-12-05 09:25:36.425 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.438 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c1465e75-c5d5-41d3-81eb-6a99dad49373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.440 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef5cef5-98ff-4c0d-9374-5bc1d3a7796f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.457 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa2105c-0d14-4289-81f1-f123c4877d88]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397835, 'reachable_time': 34124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222727, 'error': None, 'target': 'ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:36 compute-1 systemd[1]: run-netns-ovnmeta\x2df58cc02f\x2d396f\x2d494d\x2d8f1e\x2dd6f4412689c2.mount: Deactivated successfully.
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.461 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f58cc02f-396f-494d-8f1e-d6f4412689c2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:25:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:25:36.462 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[61196f66-5b00-4e84-8935-c796607c9d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:25:37 compute-1 nova_compute[189066]: 2025-12-05 09:25:37.675 189070 DEBUG nova.compute.manager [req-f57fef76-4898-4496-98b3-dea4fe6c1100 req-c833b6c4-acb5-4acd-a23d-57b08f560768 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-vif-unplugged-64e441f0-08b6-483c-9732-5217cdbe1468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:37 compute-1 nova_compute[189066]: 2025-12-05 09:25:37.676 189070 DEBUG oslo_concurrency.lockutils [req-f57fef76-4898-4496-98b3-dea4fe6c1100 req-c833b6c4-acb5-4acd-a23d-57b08f560768 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:37 compute-1 nova_compute[189066]: 2025-12-05 09:25:37.676 189070 DEBUG oslo_concurrency.lockutils [req-f57fef76-4898-4496-98b3-dea4fe6c1100 req-c833b6c4-acb5-4acd-a23d-57b08f560768 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:37 compute-1 nova_compute[189066]: 2025-12-05 09:25:37.677 189070 DEBUG oslo_concurrency.lockutils [req-f57fef76-4898-4496-98b3-dea4fe6c1100 req-c833b6c4-acb5-4acd-a23d-57b08f560768 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:37 compute-1 nova_compute[189066]: 2025-12-05 09:25:37.677 189070 DEBUG nova.compute.manager [req-f57fef76-4898-4496-98b3-dea4fe6c1100 req-c833b6c4-acb5-4acd-a23d-57b08f560768 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] No waiting events found dispatching network-vif-unplugged-64e441f0-08b6-483c-9732-5217cdbe1468 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:25:37 compute-1 nova_compute[189066]: 2025-12-05 09:25:37.677 189070 DEBUG nova.compute.manager [req-f57fef76-4898-4496-98b3-dea4fe6c1100 req-c833b6c4-acb5-4acd-a23d-57b08f560768 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-vif-unplugged-64e441f0-08b6-483c-9732-5217cdbe1468 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:25:38 compute-1 podman[222751]: 2025-12-05 09:25:38.626670809 +0000 UTC m=+0.071088216 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:25:38 compute-1 nova_compute[189066]: 2025-12-05 09:25:38.803 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:39 compute-1 ovn_controller[95809]: 2025-12-05T09:25:39Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:64:c3:47 10.100.0.14
Dec 05 09:25:39 compute-1 ovn_controller[95809]: 2025-12-05T09:25:39Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:64:c3:47 10.100.0.14
Dec 05 09:25:40 compute-1 sshd-session[222620]: Received disconnect from 101.47.162.91 port 58106:11: Bye Bye [preauth]
Dec 05 09:25:40 compute-1 sshd-session[222620]: Disconnected from authenticating user root 101.47.162.91 port 58106 [preauth]
Dec 05 09:25:41 compute-1 nova_compute[189066]: 2025-12-05 09:25:41.327 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:42 compute-1 podman[222771]: 2025-12-05 09:25:42.626952404 +0000 UTC m=+0.065473258 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, distribution-scope=public, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 05 09:25:43 compute-1 nova_compute[189066]: 2025-12-05 09:25:43.803 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.118 189070 DEBUG nova.compute.manager [req-c7efca7f-3ccf-4d9e-8a23-79954cbe79b3 req-95e32b4c-b1a5-477f-93a0-bd799f7ac003 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.119 189070 DEBUG oslo_concurrency.lockutils [req-c7efca7f-3ccf-4d9e-8a23-79954cbe79b3 req-95e32b4c-b1a5-477f-93a0-bd799f7ac003 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.119 189070 DEBUG oslo_concurrency.lockutils [req-c7efca7f-3ccf-4d9e-8a23-79954cbe79b3 req-95e32b4c-b1a5-477f-93a0-bd799f7ac003 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.119 189070 DEBUG oslo_concurrency.lockutils [req-c7efca7f-3ccf-4d9e-8a23-79954cbe79b3 req-95e32b4c-b1a5-477f-93a0-bd799f7ac003 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.119 189070 DEBUG nova.compute.manager [req-c7efca7f-3ccf-4d9e-8a23-79954cbe79b3 req-95e32b4c-b1a5-477f-93a0-bd799f7ac003 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] No waiting events found dispatching network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.119 189070 WARNING nova.compute.manager [req-c7efca7f-3ccf-4d9e-8a23-79954cbe79b3 req-95e32b4c-b1a5-477f-93a0-bd799f7ac003 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received unexpected event network-vif-plugged-64e441f0-08b6-483c-9732-5217cdbe1468 for instance with vm_state active and task_state deleting.
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.310 189070 DEBUG nova.network.neutron [-] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.333 189070 INFO nova.compute.manager [-] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Took 7.97 seconds to deallocate network for instance.
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.460 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.461 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.617 189070 DEBUG nova.compute.provider_tree [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.644 189070 DEBUG nova.scheduler.client.report [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.850 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.904 189070 DEBUG nova.compute.manager [req-549f850e-5613-4811-a026-729759a45c64 req-4b77d5e0-93b2-47b0-81e0-bb907076e53e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-changed-9895d3af-515e-43f4-bc1f-97b82be6c710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.904 189070 DEBUG nova.compute.manager [req-549f850e-5613-4811-a026-729759a45c64 req-4b77d5e0-93b2-47b0-81e0-bb907076e53e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Refreshing instance network info cache due to event network-changed-9895d3af-515e-43f4-bc1f-97b82be6c710. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.905 189070 DEBUG oslo_concurrency.lockutils [req-549f850e-5613-4811-a026-729759a45c64 req-4b77d5e0-93b2-47b0-81e0-bb907076e53e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.905 189070 DEBUG oslo_concurrency.lockutils [req-549f850e-5613-4811-a026-729759a45c64 req-4b77d5e0-93b2-47b0-81e0-bb907076e53e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.905 189070 DEBUG nova.network.neutron [req-549f850e-5613-4811-a026-729759a45c64 req-4b77d5e0-93b2-47b0-81e0-bb907076e53e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Refreshing network info cache for port 9895d3af-515e-43f4-bc1f-97b82be6c710 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:25:44 compute-1 nova_compute[189066]: 2025-12-05 09:25:44.972 189070 INFO nova.scheduler.client.report [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Deleted allocations for instance 6b54aedd-9d9e-436b-9010-c5b04ffaca40
Dec 05 09:25:45 compute-1 podman[222792]: 2025-12-05 09:25:45.633796867 +0000 UTC m=+0.065965841 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:25:46 compute-1 nova_compute[189066]: 2025-12-05 09:25:46.331 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:48 compute-1 nova_compute[189066]: 2025-12-05 09:25:48.806 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:48 compute-1 nova_compute[189066]: 2025-12-05 09:25:48.974 189070 DEBUG nova.network.neutron [req-0ae23b39-ae7d-42cc-89c2-5f6e3ccd069a req-cd67024a-a270-4a67-a372-f252352f5569 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updated VIF entry in instance network info cache for port 64e441f0-08b6-483c-9732-5217cdbe1468. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:25:48 compute-1 nova_compute[189066]: 2025-12-05 09:25:48.975 189070 DEBUG nova.network.neutron [req-0ae23b39-ae7d-42cc-89c2-5f6e3ccd069a req-cd67024a-a270-4a67-a372-f252352f5569 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Updating instance_info_cache with network_info: [{"id": "64e441f0-08b6-483c-9732-5217cdbe1468", "address": "fa:16:3e:5f:73:50", "network": {"id": "f58cc02f-396f-494d-8f1e-d6f4412689c2", "bridge": "br-int", "label": "tempest-network-smoke--188393732", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5f:7350", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64e441f0-08", "ovs_interfaceid": "64e441f0-08b6-483c-9732-5217cdbe1468", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:49 compute-1 nova_compute[189066]: 2025-12-05 09:25:49.519 189070 DEBUG oslo_concurrency.lockutils [req-0ae23b39-ae7d-42cc-89c2-5f6e3ccd069a req-cd67024a-a270-4a67-a372-f252352f5569 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-6b54aedd-9d9e-436b-9010-c5b04ffaca40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:49 compute-1 nova_compute[189066]: 2025-12-05 09:25:49.673 189070 DEBUG oslo_concurrency.lockutils [None req-ea181a10-02f4-4eb7-a92a-d43bad462a51 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "6b54aedd-9d9e-436b-9010-c5b04ffaca40" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 13.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:25:49 compute-1 ovn_controller[95809]: 2025-12-05T09:25:49Z|00066|binding|INFO|Releasing lport 76b453b1-696b-4d6e-a37b-38d94febf25b from this chassis (sb_readonly=0)
Dec 05 09:25:49 compute-1 nova_compute[189066]: 2025-12-05 09:25:49.926 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:50 compute-1 nova_compute[189066]: 2025-12-05 09:25:50.207 189070 DEBUG nova.compute.manager [req-02db40a2-7774-4467-98f4-daeb7bb190a9 req-1df8c2c0-04a9-4c68-bf8b-b302eb124c75 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Received event network-vif-deleted-64e441f0-08b6-483c-9732-5217cdbe1468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:25:50 compute-1 nova_compute[189066]: 2025-12-05 09:25:50.207 189070 INFO nova.compute.manager [req-02db40a2-7774-4467-98f4-daeb7bb190a9 req-1df8c2c0-04a9-4c68-bf8b-b302eb124c75 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Neutron deleted interface 64e441f0-08b6-483c-9732-5217cdbe1468; detaching it from the instance and deleting it from the info cache
Dec 05 09:25:50 compute-1 nova_compute[189066]: 2025-12-05 09:25:50.208 189070 DEBUG nova.network.neutron [req-02db40a2-7774-4467-98f4-daeb7bb190a9 req-1df8c2c0-04a9-4c68-bf8b-b302eb124c75 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Dec 05 09:25:50 compute-1 nova_compute[189066]: 2025-12-05 09:25:50.211 189070 DEBUG nova.compute.manager [req-02db40a2-7774-4467-98f4-daeb7bb190a9 req-1df8c2c0-04a9-4c68-bf8b-b302eb124c75 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Detach interface failed, port_id=64e441f0-08b6-483c-9732-5217cdbe1468, reason: Instance 6b54aedd-9d9e-436b-9010-c5b04ffaca40 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Dec 05 09:25:51 compute-1 nova_compute[189066]: 2025-12-05 09:25:51.227 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764926736.2261176, 6b54aedd-9d9e-436b-9010-c5b04ffaca40 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:25:51 compute-1 nova_compute[189066]: 2025-12-05 09:25:51.228 189070 INFO nova.compute.manager [-] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] VM Stopped (Lifecycle Event)
Dec 05 09:25:51 compute-1 nova_compute[189066]: 2025-12-05 09:25:51.300 189070 DEBUG nova.compute.manager [None req-ed9fe4b0-3097-44d8-94ff-114c3dd488d9 - - - - - -] [instance: 6b54aedd-9d9e-436b-9010-c5b04ffaca40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:25:51 compute-1 nova_compute[189066]: 2025-12-05 09:25:51.368 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:51 compute-1 podman[222816]: 2025-12-05 09:25:51.618174637 +0000 UTC m=+0.056446425 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:25:53 compute-1 nova_compute[189066]: 2025-12-05 09:25:53.087 189070 DEBUG nova.network.neutron [req-549f850e-5613-4811-a026-729759a45c64 req-4b77d5e0-93b2-47b0-81e0-bb907076e53e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updated VIF entry in instance network info cache for port 9895d3af-515e-43f4-bc1f-97b82be6c710. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:25:53 compute-1 nova_compute[189066]: 2025-12-05 09:25:53.088 189070 DEBUG nova.network.neutron [req-549f850e-5613-4811-a026-729759a45c64 req-4b77d5e0-93b2-47b0-81e0-bb907076e53e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updating instance_info_cache with network_info: [{"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:25:53 compute-1 nova_compute[189066]: 2025-12-05 09:25:53.125 189070 DEBUG oslo_concurrency.lockutils [req-549f850e-5613-4811-a026-729759a45c64 req-4b77d5e0-93b2-47b0-81e0-bb907076e53e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:25:53 compute-1 nova_compute[189066]: 2025-12-05 09:25:53.807 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:56 compute-1 nova_compute[189066]: 2025-12-05 09:25:56.408 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:25:58 compute-1 nova_compute[189066]: 2025-12-05 09:25:58.865 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.136 189070 DEBUG nova.compute.manager [req-49053ccc-f68a-4c28-a20a-0d7620b8f6d5 req-882ff673-44c0-4c93-826c-5e5d8600c33b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-changed-9895d3af-515e-43f4-bc1f-97b82be6c710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.137 189070 DEBUG nova.compute.manager [req-49053ccc-f68a-4c28-a20a-0d7620b8f6d5 req-882ff673-44c0-4c93-826c-5e5d8600c33b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Refreshing instance network info cache due to event network-changed-9895d3af-515e-43f4-bc1f-97b82be6c710. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.137 189070 DEBUG oslo_concurrency.lockutils [req-49053ccc-f68a-4c28-a20a-0d7620b8f6d5 req-882ff673-44c0-4c93-826c-5e5d8600c33b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.137 189070 DEBUG oslo_concurrency.lockutils [req-49053ccc-f68a-4c28-a20a-0d7620b8f6d5 req-882ff673-44c0-4c93-826c-5e5d8600c33b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.138 189070 DEBUG nova.network.neutron [req-49053ccc-f68a-4c28-a20a-0d7620b8f6d5 req-882ff673-44c0-4c93-826c-5e5d8600c33b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Refreshing network info cache for port 9895d3af-515e-43f4-bc1f-97b82be6c710 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.169 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "b4e74925-5201-4f70-9beb-258ed9dc025a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.170 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.171 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.171 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.171 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.173 189070 INFO nova.compute.manager [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Terminating instance
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.174 189070 DEBUG nova.compute.manager [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:26:00 compute-1 kernel: tap9895d3af-51 (unregistering): left promiscuous mode
Dec 05 09:26:00 compute-1 NetworkManager[55704]: <info>  [1764926760.2111] device (tap9895d3af-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:26:00 compute-1 ovn_controller[95809]: 2025-12-05T09:26:00Z|00067|binding|INFO|Releasing lport 9895d3af-515e-43f4-bc1f-97b82be6c710 from this chassis (sb_readonly=0)
Dec 05 09:26:00 compute-1 ovn_controller[95809]: 2025-12-05T09:26:00Z|00068|binding|INFO|Setting lport 9895d3af-515e-43f4-bc1f-97b82be6c710 down in Southbound
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.217 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 ovn_controller[95809]: 2025-12-05T09:26:00Z|00069|binding|INFO|Removing iface tap9895d3af-51 ovn-installed in OVS
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.219 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.233 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:c3:47 10.100.0.14'], port_security=['fa:16:3e:64:c3:47 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b4e74925-5201-4f70-9beb-258ed9dc025a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ce38059-35e2-48bf-bd81-40e486d57627', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efceb9f71c3c447ea59c2d2694f9e636', 'neutron:revision_number': '4', 'neutron:security_group_ids': '005d7801-6db4-4517-9d42-7f13da167a01 7f744df6-6fc6-436b-b5e3-475437dac336', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2000da9-080b-4941-bfdd-da4222442ad0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=9895d3af-515e-43f4-bc1f-97b82be6c710) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.235 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.236 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 9895d3af-515e-43f4-bc1f-97b82be6c710 in datapath 6ce38059-35e2-48bf-bd81-40e486d57627 unbound from our chassis
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.238 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ce38059-35e2-48bf-bd81-40e486d57627, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.240 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[66621d9b-f1f3-41d0-b6cb-2dd3fc11498c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.241 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627 namespace which is not needed anymore
Dec 05 09:26:00 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec 05 09:26:00 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 15.026s CPU time.
Dec 05 09:26:00 compute-1 systemd-machined[154815]: Machine qemu-4-instance-00000009 terminated.
Dec 05 09:26:00 compute-1 podman[222844]: 2025-12-05 09:26:00.333760853 +0000 UTC m=+0.079945356 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 05 09:26:00 compute-1 neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627[222536]: [NOTICE]   (222545) : haproxy version is 2.8.14-c23fe91
Dec 05 09:26:00 compute-1 neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627[222536]: [NOTICE]   (222545) : path to executable is /usr/sbin/haproxy
Dec 05 09:26:00 compute-1 neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627[222536]: [WARNING]  (222545) : Exiting Master process...
Dec 05 09:26:00 compute-1 neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627[222536]: [ALERT]    (222545) : Current worker (222547) exited with code 143 (Terminated)
Dec 05 09:26:00 compute-1 neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627[222536]: [WARNING]  (222545) : All workers exited. Exiting... (0)
Dec 05 09:26:00 compute-1 systemd[1]: libpod-59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2.scope: Deactivated successfully.
Dec 05 09:26:00 compute-1 podman[222888]: 2025-12-05 09:26:00.396988415 +0000 UTC m=+0.049245927 container died 59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:26:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2-userdata-shm.mount: Deactivated successfully.
Dec 05 09:26:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-adfcae394d927dbf643aea13343d5616fa9d94a201d320693c9ee93d7dc0abfb-merged.mount: Deactivated successfully.
Dec 05 09:26:00 compute-1 podman[222888]: 2025-12-05 09:26:00.452341463 +0000 UTC m=+0.104598975 container cleanup 59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:26:00 compute-1 systemd[1]: libpod-conmon-59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2.scope: Deactivated successfully.
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.463 189070 INFO nova.virt.libvirt.driver [-] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Instance destroyed successfully.
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.463 189070 DEBUG nova.objects.instance [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lazy-loading 'resources' on Instance uuid b4e74925-5201-4f70-9beb-258ed9dc025a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.483 189070 DEBUG nova.virt.libvirt.vif [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1546276319-access_point-1221653276',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1546276319-access_point-1221653276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1546276319-ac',id=9,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFJkw+l11RVGYygdYFY+5dTvjVW6UnWERneAkkJsvcy3vaYGsZxqSJKH1T1OhIDkYKsREB1fnusNh4+5qwwXs+krdjcuQPiAa96VT7yue5/M3ONTUDBINBXBPmqK+uIgKg==',key_name='tempest-TestSecurityGroupsBasicOps-926259432',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:25:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efceb9f71c3c447ea59c2d2694f9e636',ramdisk_id='',reservation_id='r-asrkgshf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1546276319',owner_user_name='tempest-TestSecurityGroupsBasicOps-1546276319-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:25:26Z,user_data=None,user_id='3cf2b75d6732438a9a3626bc5db6d76e',uuid=b4e74925-5201-4f70-9beb-258ed9dc025a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.483 189070 DEBUG nova.network.os_vif_util [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Converting VIF {"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.485 189070 DEBUG nova.network.os_vif_util [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:c3:47,bridge_name='br-int',has_traffic_filtering=True,id=9895d3af-515e-43f4-bc1f-97b82be6c710,network=Network(6ce38059-35e2-48bf-bd81-40e486d57627),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9895d3af-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.485 189070 DEBUG os_vif [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:c3:47,bridge_name='br-int',has_traffic_filtering=True,id=9895d3af-515e-43f4-bc1f-97b82be6c710,network=Network(6ce38059-35e2-48bf-bd81-40e486d57627),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9895d3af-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.489 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.489 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9895d3af-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.491 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.494 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.499 189070 INFO os_vif [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:c3:47,bridge_name='br-int',has_traffic_filtering=True,id=9895d3af-515e-43f4-bc1f-97b82be6c710,network=Network(6ce38059-35e2-48bf-bd81-40e486d57627),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9895d3af-51')
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.500 189070 INFO nova.virt.libvirt.driver [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Deleting instance files /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a_del
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.501 189070 INFO nova.virt.libvirt.driver [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Deletion of /var/lib/nova/instances/b4e74925-5201-4f70-9beb-258ed9dc025a_del complete
Dec 05 09:26:00 compute-1 podman[222933]: 2025-12-05 09:26:00.528164996 +0000 UTC m=+0.052458048 container remove 59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.534 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca0eb75-3d3b-40ce-96e3-33074c97e641]: (4, ('Fri Dec  5 09:26:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627 (59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2)\n59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2\nFri Dec  5 09:26:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627 (59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2)\n59e5cde960d47d37f774c357e054496b8373da5bb55a8cf4f2c4ddb44f7813d2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.535 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f49f353d-8b26-4066-aa46-28563346cf1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.536 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ce38059-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.538 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 kernel: tap6ce38059-30: left promiscuous mode
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.540 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.544 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a3cb410d-e2c4-4a55-9d2e-d23e5e9cdb3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:26:00 compute-1 nova_compute[189066]: 2025-12-05 09:26:00.552 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.558 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2a865ad4-9728-4093-be46-e0c7b5fede23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.560 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9874a2-7358-4cec-831a-c2f0c74fdd86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.581 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a5c4ad-c7f8-4ed2-839c-1e1b171688c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399572, 'reachable_time': 28264, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222946, 'error': None, 'target': 'ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.585 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6ce38059-35e2-48bf-bd81-40e486d57627 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:26:00 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:00.585 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[09280007-5cfe-4c5f-88f5-4a07ee495308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:26:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d6ce38059\x2d35e2\x2d48bf\x2dbd81\x2d40e486d57627.mount: Deactivated successfully.
Dec 05 09:26:01 compute-1 nova_compute[189066]: 2025-12-05 09:26:01.262 189070 INFO nova.compute.manager [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Took 1.09 seconds to destroy the instance on the hypervisor.
Dec 05 09:26:01 compute-1 nova_compute[189066]: 2025-12-05 09:26:01.262 189070 DEBUG oslo.service.loopingcall [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:26:01 compute-1 nova_compute[189066]: 2025-12-05 09:26:01.262 189070 DEBUG nova.compute.manager [-] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:26:01 compute-1 nova_compute[189066]: 2025-12-05 09:26:01.263 189070 DEBUG nova.network.neutron [-] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:26:01 compute-1 sshd-session[222879]: Received disconnect from 185.118.15.236 port 36168:11: Bye Bye [preauth]
Dec 05 09:26:01 compute-1 sshd-session[222879]: Disconnected from authenticating user root 185.118.15.236 port 36168 [preauth]
Dec 05 09:26:03 compute-1 podman[222947]: 2025-12-05 09:26:03.747829357 +0000 UTC m=+0.182516370 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:26:03 compute-1 nova_compute[189066]: 2025-12-05 09:26:03.868 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.264 189070 DEBUG nova.compute.manager [req-5893e1ad-7128-4d0e-9115-bd9b39c0948e req-673f9c68-38d4-4d3b-be79-7523256d4cd8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-vif-unplugged-9895d3af-515e-43f4-bc1f-97b82be6c710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.264 189070 DEBUG oslo_concurrency.lockutils [req-5893e1ad-7128-4d0e-9115-bd9b39c0948e req-673f9c68-38d4-4d3b-be79-7523256d4cd8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.264 189070 DEBUG oslo_concurrency.lockutils [req-5893e1ad-7128-4d0e-9115-bd9b39c0948e req-673f9c68-38d4-4d3b-be79-7523256d4cd8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.265 189070 DEBUG oslo_concurrency.lockutils [req-5893e1ad-7128-4d0e-9115-bd9b39c0948e req-673f9c68-38d4-4d3b-be79-7523256d4cd8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.265 189070 DEBUG nova.compute.manager [req-5893e1ad-7128-4d0e-9115-bd9b39c0948e req-673f9c68-38d4-4d3b-be79-7523256d4cd8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] No waiting events found dispatching network-vif-unplugged-9895d3af-515e-43f4-bc1f-97b82be6c710 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.265 189070 DEBUG nova.compute.manager [req-5893e1ad-7128-4d0e-9115-bd9b39c0948e req-673f9c68-38d4-4d3b-be79-7523256d4cd8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-vif-unplugged-9895d3af-515e-43f4-bc1f-97b82be6c710 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.464 189070 DEBUG nova.network.neutron [req-49053ccc-f68a-4c28-a20a-0d7620b8f6d5 req-882ff673-44c0-4c93-826c-5e5d8600c33b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updated VIF entry in instance network info cache for port 9895d3af-515e-43f4-bc1f-97b82be6c710. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.465 189070 DEBUG nova.network.neutron [req-49053ccc-f68a-4c28-a20a-0d7620b8f6d5 req-882ff673-44c0-4c93-826c-5e5d8600c33b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updating instance_info_cache with network_info: [{"id": "9895d3af-515e-43f4-bc1f-97b82be6c710", "address": "fa:16:3e:64:c3:47", "network": {"id": "6ce38059-35e2-48bf-bd81-40e486d57627", "bridge": "br-int", "label": "tempest-network-smoke--458420447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efceb9f71c3c447ea59c2d2694f9e636", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9895d3af-51", "ovs_interfaceid": "9895d3af-515e-43f4-bc1f-97b82be6c710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:26:05 compute-1 nova_compute[189066]: 2025-12-05 09:26:05.491 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:06 compute-1 podman[222975]: 2025-12-05 09:26:06.620414144 +0000 UTC m=+0.053775170 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:26:07 compute-1 nova_compute[189066]: 2025-12-05 09:26:07.104 189070 DEBUG oslo_concurrency.lockutils [req-49053ccc-f68a-4c28-a20a-0d7620b8f6d5 req-882ff673-44c0-4c93-826c-5e5d8600c33b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b4e74925-5201-4f70-9beb-258ed9dc025a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:26:08 compute-1 nova_compute[189066]: 2025-12-05 09:26:08.060 189070 DEBUG nova.network.neutron [-] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:26:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:08.870 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:08 compute-1 nova_compute[189066]: 2025-12-05 09:26:08.871 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:08.871 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:08.871 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:09 compute-1 podman[222995]: 2025-12-05 09:26:09.663680195 +0000 UTC m=+0.101643121 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.780 189070 INFO nova.compute.manager [-] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Took 8.52 seconds to deallocate network for instance.
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.790 189070 DEBUG nova.compute.manager [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-vif-deleted-9895d3af-515e-43f4-bc1f-97b82be6c710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.791 189070 INFO nova.compute.manager [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Neutron deleted interface 9895d3af-515e-43f4-bc1f-97b82be6c710; detaching it from the instance and deleting it from the info cache
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.791 189070 DEBUG nova.network.neutron [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.846 189070 DEBUG nova.compute.manager [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Detach interface failed, port_id=9895d3af-515e-43f4-bc1f-97b82be6c710, reason: Instance b4e74925-5201-4f70-9beb-258ed9dc025a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.846 189070 DEBUG nova.compute.manager [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received event network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.847 189070 DEBUG oslo_concurrency.lockutils [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.847 189070 DEBUG oslo_concurrency.lockutils [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.847 189070 DEBUG oslo_concurrency.lockutils [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.848 189070 DEBUG nova.compute.manager [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] No waiting events found dispatching network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.848 189070 WARNING nova.compute.manager [req-d4df850a-180d-463d-8ec6-20967d97f277 req-fed8a11c-a930-4b59-83d9-e66c9c694320 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Received unexpected event network-vif-plugged-9895d3af-515e-43f4-bc1f-97b82be6c710 for instance with vm_state active and task_state deleting.
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.997 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:09 compute-1 nova_compute[189066]: 2025-12-05 09:26:09.998 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:10 compute-1 nova_compute[189066]: 2025-12-05 09:26:10.084 189070 DEBUG nova.compute.provider_tree [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:26:10 compute-1 nova_compute[189066]: 2025-12-05 09:26:10.165 189070 DEBUG nova.scheduler.client.report [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:26:10 compute-1 nova_compute[189066]: 2025-12-05 09:26:10.252 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:10 compute-1 nova_compute[189066]: 2025-12-05 09:26:10.291 189070 INFO nova.scheduler.client.report [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Deleted allocations for instance b4e74925-5201-4f70-9beb-258ed9dc025a
Dec 05 09:26:10 compute-1 nova_compute[189066]: 2025-12-05 09:26:10.375 189070 DEBUG oslo_concurrency.lockutils [None req-9bcddf5c-229a-4ed2-9868-7ebcebeb1b6c 3cf2b75d6732438a9a3626bc5db6d76e efceb9f71c3c447ea59c2d2694f9e636 - - default default] Lock "b4e74925-5201-4f70-9beb-258ed9dc025a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:10 compute-1 nova_compute[189066]: 2025-12-05 09:26:10.528 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:11.156 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:26:11 compute-1 nova_compute[189066]: 2025-12-05 09:26:11.156 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:11.157 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:26:13 compute-1 podman[223016]: 2025-12-05 09:26:13.630206827 +0000 UTC m=+0.065434787 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 05 09:26:13 compute-1 nova_compute[189066]: 2025-12-05 09:26:13.873 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.349 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.688 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.844 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.844 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.873 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.873 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.874 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.892 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.893 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.893 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.893 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.894 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.894 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.894 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.894 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.894 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.926 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.927 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.927 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:14 compute-1 nova_compute[189066]: 2025-12-05 09:26:14.927 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.097 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.099 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5775MB free_disk=73.33374786376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.099 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.099 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.168 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.169 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.197 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.212 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.237 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.237 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.462 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764926760.46027, b4e74925-5201-4f70-9beb-258ed9dc025a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.462 189070 INFO nova.compute.manager [-] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] VM Stopped (Lifecycle Event)
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.531 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:15 compute-1 nova_compute[189066]: 2025-12-05 09:26:15.643 189070 DEBUG nova.compute.manager [None req-fd05d31b-f957-4530-b2ca-5d89f786438a - - - - - -] [instance: b4e74925-5201-4f70-9beb-258ed9dc025a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:26:16 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:26:16.159 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:26:16 compute-1 podman[223039]: 2025-12-05 09:26:16.632662962 +0000 UTC m=+0.071260082 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:26:18 compute-1 nova_compute[189066]: 2025-12-05 09:26:18.906 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:20 compute-1 nova_compute[189066]: 2025-12-05 09:26:20.533 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:22 compute-1 podman[223063]: 2025-12-05 09:26:22.621592396 +0000 UTC m=+0.055313967 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:26:23 compute-1 nova_compute[189066]: 2025-12-05 09:26:23.910 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:25 compute-1 nova_compute[189066]: 2025-12-05 09:26:25.536 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:28 compute-1 nova_compute[189066]: 2025-12-05 09:26:28.912 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:30 compute-1 nova_compute[189066]: 2025-12-05 09:26:30.537 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:30 compute-1 podman[223087]: 2025-12-05 09:26:30.623867749 +0000 UTC m=+0.064521366 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:26:32 compute-1 sshd-session[223108]: Received disconnect from 122.168.194.41 port 51634:11: Bye Bye [preauth]
Dec 05 09:26:32 compute-1 sshd-session[223108]: Disconnected from authenticating user root 122.168.194.41 port 51634 [preauth]
Dec 05 09:26:33 compute-1 nova_compute[189066]: 2025-12-05 09:26:33.916 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:34 compute-1 podman[223110]: 2025-12-05 09:26:34.680352423 +0000 UTC m=+0.116945180 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec 05 09:26:35 compute-1 nova_compute[189066]: 2025-12-05 09:26:35.541 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:37 compute-1 podman[223138]: 2025-12-05 09:26:37.60885839 +0000 UTC m=+0.053277367 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:26:38 compute-1 nova_compute[189066]: 2025-12-05 09:26:38.917 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:40 compute-1 nova_compute[189066]: 2025-12-05 09:26:40.542 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:40 compute-1 podman[223157]: 2025-12-05 09:26:40.641165292 +0000 UTC m=+0.062041073 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 09:26:43 compute-1 nova_compute[189066]: 2025-12-05 09:26:43.952 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:44 compute-1 podman[223177]: 2025-12-05 09:26:44.627154984 +0000 UTC m=+0.068368600 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7)
Dec 05 09:26:45 compute-1 nova_compute[189066]: 2025-12-05 09:26:45.544 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:47 compute-1 podman[223198]: 2025-12-05 09:26:47.622562134 +0000 UTC m=+0.060048525 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:26:48 compute-1 nova_compute[189066]: 2025-12-05 09:26:48.956 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:50 compute-1 nova_compute[189066]: 2025-12-05 09:26:50.549 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:53 compute-1 podman[223222]: 2025-12-05 09:26:53.610840703 +0000 UTC m=+0.052487267 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:26:53 compute-1 nova_compute[189066]: 2025-12-05 09:26:53.958 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:55 compute-1 nova_compute[189066]: 2025-12-05 09:26:55.554 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:55 compute-1 nova_compute[189066]: 2025-12-05 09:26:55.985 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:55 compute-1 nova_compute[189066]: 2025-12-05 09:26:55.986 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:56 compute-1 nova_compute[189066]: 2025-12-05 09:26:56.026 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:26:56 compute-1 nova_compute[189066]: 2025-12-05 09:26:56.216 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:56 compute-1 nova_compute[189066]: 2025-12-05 09:26:56.216 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:56 compute-1 nova_compute[189066]: 2025-12-05 09:26:56.224 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:26:56 compute-1 nova_compute[189066]: 2025-12-05 09:26:56.225 189070 INFO nova.compute.claims [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:26:56 compute-1 nova_compute[189066]: 2025-12-05 09:26:56.988 189070 DEBUG nova.compute.provider_tree [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.000 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.000 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.018 189070 DEBUG nova.scheduler.client.report [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.030 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.070 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.854s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.071 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.141 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.143 189070 DEBUG nova.network.neutron [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.146 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.146 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.153 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.153 189070 INFO nova.compute.claims [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.340 189070 INFO nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.427 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.488 189070 DEBUG nova.policy [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.555 189070 DEBUG nova.compute.provider_tree [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.645 189070 DEBUG nova.scheduler.client.report [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.737 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.738 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.742 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.743 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.743 189070 INFO nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Creating image(s)
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.744 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.744 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.745 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.760 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.824 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.825 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.826 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.839 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.903 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.905 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.938 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.939 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.940 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.998 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.999 189070 DEBUG nova.virt.disk.api [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Checking if we can resize image /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:26:57 compute-1 nova_compute[189066]: 2025-12-05 09:26:57.999 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.064 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.064 189070 DEBUG nova.network.neutron [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.070 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.071 189070 DEBUG nova.virt.disk.api [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Cannot resize image /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.071 189070 DEBUG nova.objects.instance [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'migration_context' on Instance uuid 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.224 189070 INFO nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.227 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.228 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Ensure instance console log exists: /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.228 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.229 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.229 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.278 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.386 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.388 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.388 189070 INFO nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Creating image(s)
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.389 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "/var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.389 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "/var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.390 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "/var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.402 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.501 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.502 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.503 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.515 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.574 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.575 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.609 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.611 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.611 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.670 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.671 189070 DEBUG nova.virt.disk.api [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Checking if we can resize image /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.671 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.731 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.732 189070 DEBUG nova.virt.disk.api [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Cannot resize image /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.733 189070 DEBUG nova.objects.instance [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lazy-loading 'migration_context' on Instance uuid 43d83f29-ba12-4205-ba09-545c3dc28920 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.749 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.750 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Ensure instance console log exists: /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.751 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.751 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.751 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.915 189070 DEBUG nova.policy [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:26:58 compute-1 nova_compute[189066]: 2025-12-05 09:26:58.962 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:26:59 compute-1 nova_compute[189066]: 2025-12-05 09:26:59.340 189070 DEBUG nova.network.neutron [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Successfully created port: fee19e88-d18e-4020-97b6-26caf4ef6fa9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:27:00 compute-1 nova_compute[189066]: 2025-12-05 09:27:00.557 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:01 compute-1 podman[223276]: 2025-12-05 09:27:01.61943382 +0000 UTC m=+0.060849764 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:27:01 compute-1 nova_compute[189066]: 2025-12-05 09:27:01.845 189070 DEBUG nova.network.neutron [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Successfully updated port: fee19e88-d18e-4020-97b6-26caf4ef6fa9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:27:01 compute-1 nova_compute[189066]: 2025-12-05 09:27:01.863 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:01 compute-1 nova_compute[189066]: 2025-12-05 09:27:01.863 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:01 compute-1 nova_compute[189066]: 2025-12-05 09:27:01.863 189070 DEBUG nova.network.neutron [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:27:01 compute-1 nova_compute[189066]: 2025-12-05 09:27:01.944 189070 DEBUG nova.network.neutron [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Successfully updated port: 8977e440-f4bd-42c7-bf7b-c57e7184cc33 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:27:02 compute-1 nova_compute[189066]: 2025-12-05 09:27:02.059 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:02 compute-1 nova_compute[189066]: 2025-12-05 09:27:02.060 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquired lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:02 compute-1 nova_compute[189066]: 2025-12-05 09:27:02.060 189070 DEBUG nova.network.neutron [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:27:02 compute-1 nova_compute[189066]: 2025-12-05 09:27:02.241 189070 DEBUG nova.network.neutron [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:27:02 compute-1 nova_compute[189066]: 2025-12-05 09:27:02.379 189070 DEBUG nova.network.neutron [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:27:02 compute-1 nova_compute[189066]: 2025-12-05 09:27:02.518 189070 DEBUG nova.compute.manager [req-3caab629-9639-44e0-90b9-02b78d37b7b4 req-b7965b1c-f9c5-453f-925c-8beffa7f29f7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:02 compute-1 nova_compute[189066]: 2025-12-05 09:27:02.518 189070 DEBUG nova.compute.manager [req-3caab629-9639-44e0-90b9-02b78d37b7b4 req-b7965b1c-f9c5-453f-925c-8beffa7f29f7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing instance network info cache due to event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:27:02 compute-1 nova_compute[189066]: 2025-12-05 09:27:02.519 189070 DEBUG oslo_concurrency.lockutils [req-3caab629-9639-44e0-90b9-02b78d37b7b4 req-b7965b1c-f9c5-453f-925c-8beffa7f29f7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:03 compute-1 nova_compute[189066]: 2025-12-05 09:27:03.962 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.296 189070 DEBUG nova.network.neutron [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.306 189070 DEBUG nova.network.neutron [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Updating instance_info_cache with network_info: [{"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.359 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.359 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance network_info: |[{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.360 189070 DEBUG oslo_concurrency.lockutils [req-3caab629-9639-44e0-90b9-02b78d37b7b4 req-b7965b1c-f9c5-453f-925c-8beffa7f29f7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.360 189070 DEBUG nova.network.neutron [req-3caab629-9639-44e0-90b9-02b78d37b7b4 req-b7965b1c-f9c5-453f-925c-8beffa7f29f7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.363 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Start _get_guest_xml network_info=[{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.367 189070 WARNING nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.369 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Releasing lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.369 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Instance network_info: |[{"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.371 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Start _get_guest_xml network_info=[{"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.375 189070 DEBUG nova.virt.libvirt.host [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.375 189070 DEBUG nova.virt.libvirt.host [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.376 189070 WARNING nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.379 189070 DEBUG nova.virt.libvirt.host [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.380 189070 DEBUG nova.virt.libvirt.host [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.381 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.382 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.382 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.382 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.382 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.383 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.383 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.383 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.384 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.384 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.384 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.384 189070 DEBUG nova.virt.hardware [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.388 189070 DEBUG nova.virt.libvirt.vif [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:26:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1301659118',display_name='tempest-TestNetworkAdvancedServerOps-server-1301659118',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1301659118',id=10,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/IH5546c3h1NqRaMnG8oeFTcOTykF/TM0HkVvHYHh6GuyCm0ZuW+dN3e3eFJ7fYYhI257JWo40Mrp34xaUZfOzZGF5l/jYsmwg8LZO1T3z+5cPaeu5EMQJR4Fd4T/olg==',key_name='tempest-TestNetworkAdvancedServerOps-1059800776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-i1fw0oxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:26:57Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=9d2b0f76-0408-4f6b-8a37-ac9882a44b4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.388 189070 DEBUG nova.network.os_vif_util [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.389 189070 DEBUG nova.network.os_vif_util [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.390 189070 DEBUG nova.objects.instance [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.392 189070 DEBUG nova.virt.libvirt.host [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.392 189070 DEBUG nova.virt.libvirt.host [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.401 189070 DEBUG nova.virt.libvirt.host [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.401 189070 DEBUG nova.virt.libvirt.host [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.402 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.403 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.403 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.403 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.404 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.404 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.404 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.404 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.405 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.405 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.405 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.406 189070 DEBUG nova.virt.hardware [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.409 189070 DEBUG nova.virt.libvirt.vif [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:26:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1534053809',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1534053809',id=11,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aa5b008731384d18ba83c8d69e76bcef',ramdisk_id='',reservation_id='r-mjfc6v5g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-63392288',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-63392288-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:26:58Z,user_data=None,user_id='a9b7ab1c9c854146af8af16f337a063d',uuid=43d83f29-ba12-4205-ba09-545c3dc28920,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.409 189070 DEBUG nova.network.os_vif_util [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Converting VIF {"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.410 189070 DEBUG nova.network.os_vif_util [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:c8:37,bridge_name='br-int',has_traffic_filtering=True,id=8977e440-f4bd-42c7-bf7b-c57e7184cc33,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8977e440-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.410 189070 DEBUG nova.objects.instance [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lazy-loading 'pci_devices' on Instance uuid 43d83f29-ba12-4205-ba09-545c3dc28920 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.413 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <uuid>9d2b0f76-0408-4f6b-8a37-ac9882a44b4e</uuid>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <name>instance-0000000a</name>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1301659118</nova:name>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:27:04</nova:creationTime>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:user uuid="65751a90715341b2984ef84ebbaa1650">tempest-TestNetworkAdvancedServerOps-1829130727-project-member</nova:user>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:project uuid="e26ae3fdd48d4947978a480f70e14f84">tempest-TestNetworkAdvancedServerOps-1829130727</nova:project>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:port uuid="fee19e88-d18e-4020-97b6-26caf4ef6fa9">
Dec 05 09:27:04 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <system>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="serial">9d2b0f76-0408-4f6b-8a37-ac9882a44b4e</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="uuid">9d2b0f76-0408-4f6b-8a37-ac9882a44b4e</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </system>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <os>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </os>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <features>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </features>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:ad:12:e3"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <target dev="tapfee19e88-d1"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/console.log" append="off"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <video>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </video>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:27:04 compute-1 nova_compute[189066]: </domain>
Dec 05 09:27:04 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.415 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Preparing to wait for external event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.415 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.415 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.416 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.416 189070 DEBUG nova.virt.libvirt.vif [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:26:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1301659118',display_name='tempest-TestNetworkAdvancedServerOps-server-1301659118',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1301659118',id=10,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/IH5546c3h1NqRaMnG8oeFTcOTykF/TM0HkVvHYHh6GuyCm0ZuW+dN3e3eFJ7fYYhI257JWo40Mrp34xaUZfOzZGF5l/jYsmwg8LZO1T3z+5cPaeu5EMQJR4Fd4T/olg==',key_name='tempest-TestNetworkAdvancedServerOps-1059800776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-i1fw0oxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:26:57Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=9d2b0f76-0408-4f6b-8a37-ac9882a44b4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.417 189070 DEBUG nova.network.os_vif_util [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.417 189070 DEBUG nova.network.os_vif_util [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.418 189070 DEBUG os_vif [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.418 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.419 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.419 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.424 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.424 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfee19e88-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.424 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfee19e88-d1, col_values=(('external_ids', {'iface-id': 'fee19e88-d18e-4020-97b6-26caf4ef6fa9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:12:e3', 'vm-uuid': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.426 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:04 compute-1 NetworkManager[55704]: <info>  [1764926824.4281] manager: (tapfee19e88-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.430 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.434 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.435 189070 INFO os_vif [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1')
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.440 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <uuid>43d83f29-ba12-4205-ba09-545c3dc28920</uuid>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <name>instance-0000000b</name>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1534053809</nova:name>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:27:04</nova:creationTime>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:user uuid="a9b7ab1c9c854146af8af16f337a063d">tempest-LiveAutoBlockMigrationV225Test-63392288-project-member</nova:user>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:project uuid="aa5b008731384d18ba83c8d69e76bcef">tempest-LiveAutoBlockMigrationV225Test-63392288</nova:project>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         <nova:port uuid="8977e440-f4bd-42c7-bf7b-c57e7184cc33">
Dec 05 09:27:04 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <system>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="serial">43d83f29-ba12-4205-ba09-545c3dc28920</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="uuid">43d83f29-ba12-4205-ba09-545c3dc28920</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </system>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <os>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </os>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <features>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </features>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk.config"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:81:c8:37"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <target dev="tap8977e440-f4"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/console.log" append="off"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <video>
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </video>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:27:04 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:27:04 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:27:04 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:27:04 compute-1 nova_compute[189066]: </domain>
Dec 05 09:27:04 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.441 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Preparing to wait for external event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.441 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.441 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.441 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.442 189070 DEBUG nova.virt.libvirt.vif [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:26:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1534053809',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1534053809',id=11,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aa5b008731384d18ba83c8d69e76bcef',ramdisk_id='',reservation_id='r-mjfc6v5g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-63392288',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-63392288-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:26:58Z,user_data=None,user_id='a9b7ab1c9c854146af8af16f337a063d',uuid=43d83f29-ba12-4205-ba09-545c3dc28920,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.442 189070 DEBUG nova.network.os_vif_util [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Converting VIF {"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.443 189070 DEBUG nova.network.os_vif_util [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:c8:37,bridge_name='br-int',has_traffic_filtering=True,id=8977e440-f4bd-42c7-bf7b-c57e7184cc33,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8977e440-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.443 189070 DEBUG os_vif [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:c8:37,bridge_name='br-int',has_traffic_filtering=True,id=8977e440-f4bd-42c7-bf7b-c57e7184cc33,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8977e440-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.444 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.444 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.444 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.446 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.446 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8977e440-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.447 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8977e440-f4, col_values=(('external_ids', {'iface-id': '8977e440-f4bd-42c7-bf7b-c57e7184cc33', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:81:c8:37', 'vm-uuid': '43d83f29-ba12-4205-ba09-545c3dc28920'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:04 compute-1 NetworkManager[55704]: <info>  [1764926824.4503] manager: (tap8977e440-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.451 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.457 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.458 189070 INFO os_vif [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:c8:37,bridge_name='br-int',has_traffic_filtering=True,id=8977e440-f4bd-42c7-bf7b-c57e7184cc33,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8977e440-f4')
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.493 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.493 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.493 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No VIF found with MAC fa:16:3e:ad:12:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.494 189070 INFO nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Using config drive
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.502 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.502 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.503 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] No VIF found with MAC fa:16:3e:81:c8:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.503 189070 INFO nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Using config drive
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.814 189070 DEBUG nova.compute.manager [req-60d45398-3de6-42ba-9364-5fbf59d348a5 req-01307fb7-3034-448c-a1d9-b71d972ce8ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-changed-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.814 189070 DEBUG nova.compute.manager [req-60d45398-3de6-42ba-9364-5fbf59d348a5 req-01307fb7-3034-448c-a1d9-b71d972ce8ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Refreshing instance network info cache due to event network-changed-8977e440-f4bd-42c7-bf7b-c57e7184cc33. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.815 189070 DEBUG oslo_concurrency.lockutils [req-60d45398-3de6-42ba-9364-5fbf59d348a5 req-01307fb7-3034-448c-a1d9-b71d972ce8ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.815 189070 DEBUG oslo_concurrency.lockutils [req-60d45398-3de6-42ba-9364-5fbf59d348a5 req-01307fb7-3034-448c-a1d9-b71d972ce8ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:04 compute-1 nova_compute[189066]: 2025-12-05 09:27:04.815 189070 DEBUG nova.network.neutron [req-60d45398-3de6-42ba-9364-5fbf59d348a5 req-01307fb7-3034-448c-a1d9-b71d972ce8ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Refreshing network info cache for port 8977e440-f4bd-42c7-bf7b-c57e7184cc33 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:27:05 compute-1 podman[223301]: 2025-12-05 09:27:05.650387733 +0000 UTC m=+0.085132014 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:27:05 compute-1 nova_compute[189066]: 2025-12-05 09:27:05.791 189070 INFO nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Creating config drive at /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config
Dec 05 09:27:05 compute-1 nova_compute[189066]: 2025-12-05 09:27:05.796 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo_dsbpyf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:05 compute-1 nova_compute[189066]: 2025-12-05 09:27:05.854 189070 INFO nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Creating config drive at /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk.config
Dec 05 09:27:05 compute-1 nova_compute[189066]: 2025-12-05 09:27:05.859 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9g_naxko execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:05 compute-1 nova_compute[189066]: 2025-12-05 09:27:05.925 189070 DEBUG oslo_concurrency.processutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo_dsbpyf" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:05 compute-1 nova_compute[189066]: 2025-12-05 09:27:05.986 189070 DEBUG oslo_concurrency.processutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9g_naxko" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:05 compute-1 kernel: tapfee19e88-d1: entered promiscuous mode
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.0013] manager: (tapfee19e88-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00070|binding|INFO|Claiming lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 for this chassis.
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00071|binding|INFO|fee19e88-d18e-4020-97b6-26caf4ef6fa9: Claiming fa:16:3e:ad:12:e3 10.100.0.14
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.003 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.022 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:12:e3 10.100.0.14'], port_security=['fa:16:3e:ad:12:e3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8cc6de91-de2a-4730-92b2-0be4bf62385a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512618ca-52a2-4c70-b76c-f25c0d00e2da, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=fee19e88-d18e-4020-97b6-26caf4ef6fa9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.023 105272 INFO neutron.agent.ovn.metadata.agent [-] Port fee19e88-d18e-4020-97b6-26caf4ef6fa9 in datapath 2d5ff262-0b2d-49fb-b643-980510ce97c7 bound to our chassis
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.025 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2d5ff262-0b2d-49fb-b643-980510ce97c7
Dec 05 09:27:06 compute-1 systemd-udevd[223353]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.040 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d7758a-4ec8-4204-b98f-800f34e0b512]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.043 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2d5ff262-01 in ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.046 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2d5ff262-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.046 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c725fd4d-fe1f-4d7b-bb15-5ec08d291c8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.0484] device (tapfee19e88-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.0500] manager: (tap8977e440-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.0506] device (tapfee19e88-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:27:06 compute-1 systemd-udevd[223362]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.048 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[cd65f8a9-04b8-4f17-a603-1f3de592e81f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.0642] device (tap8977e440-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:27:06 compute-1 kernel: tap8977e440-f4: entered promiscuous mode
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.0654] device (tap8977e440-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00072|binding|INFO|Claiming lport 8977e440-f4bd-42c7-bf7b-c57e7184cc33 for this chassis.
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00073|binding|INFO|8977e440-f4bd-42c7-bf7b-c57e7184cc33: Claiming fa:16:3e:81:c8:37 10.100.0.12
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00074|binding|INFO|Claiming lport d771a01e-fae5-4012-b36a-241c6c0bf739 for this chassis.
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00075|binding|INFO|d771a01e-fae5-4012-b36a-241c6c0bf739: Claiming fa:16:3e:3f:ce:ca 19.80.0.182
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.066 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.069 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00076|binding|INFO|Setting lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 ovn-installed in OVS
Dec 05 09:27:06 compute-1 systemd-machined[154815]: New machine qemu-5-instance-0000000a.
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.075 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[5318fdf4-5cb5-41ba-a43a-310d4c05eb76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.078 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00077|binding|INFO|Setting lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 up in Southbound
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.082 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:ce:ca 19.80.0.182'], port_security=['fa:16:3e:3f:ce:ca 19.80.0.182'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['8977e440-f4bd-42c7-bf7b-c57e7184cc33'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1788543283', 'neutron:cidrs': '19.80.0.182/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1788543283', 'neutron:project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab115429-13ca-4258-9a08-fc76d0ca17b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=868394f9-662d-4629-849a-2949c92470e2, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d771a01e-fae5-4012-b36a-241c6c0bf739) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.083 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:c8:37 10.100.0.12'], port_security=['fa:16:3e:81:c8:37 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-840969275', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-840969275', 'neutron:project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab115429-13ca-4258-9a08-fc76d0ca17b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=36325ae0-997b-4e15-a889-e33151da06b1, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=8977e440-f4bd-42c7-bf7b-c57e7184cc33) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:27:06 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-0000000a.
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.104 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[bff48c6e-0749-46ce-b926-6927e09acb93]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 systemd-machined[154815]: New machine qemu-6-instance-0000000b.
Dec 05 09:27:06 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-0000000b.
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.150 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[d91db425-ef2d-4568-898e-3aaf4af4706a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00078|binding|INFO|Setting lport 8977e440-f4bd-42c7-bf7b-c57e7184cc33 ovn-installed in OVS
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00079|binding|INFO|Setting lport 8977e440-f4bd-42c7-bf7b-c57e7184cc33 up in Southbound
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00080|binding|INFO|Setting lport d771a01e-fae5-4012-b36a-241c6c0bf739 up in Southbound
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.156 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.159 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[07ed9f22-d715-4c9e-a439-ded1da1011d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.1600] manager: (tap2d5ff262-00): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.201 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[03b9178b-3d8c-4c85-bd05-73d0d4ce332b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.204 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[d5000fb3-fc6d-4343-b34f-8af302292a98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.2322] device (tap2d5ff262-00): carrier: link connected
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.235 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[18f6b53d-800e-4330-9ddf-3e1bb7f43cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.253 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d62c7b-c8a1-43cf-88cd-13a8c648e2db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2d5ff262-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:1f:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409923, 'reachable_time': 27973, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223405, 'error': None, 'target': 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.271 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e7988a7b-0c9d-4880-9df7-ae359fdff8af]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:1fea'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409923, 'tstamp': 409923}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223406, 'error': None, 'target': 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.288 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2009d121-baf9-4c14-87b8-de714a8515c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2d5ff262-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:1f:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409923, 'reachable_time': 27973, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223407, 'error': None, 'target': 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.329 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2996e0e1-56e8-4419-8103-e75e4b90075b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.395 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[39e0ad15-05f6-4d4a-b708-c5003ffd5d04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.397 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d5ff262-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.397 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.398 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d5ff262-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:06 compute-1 NetworkManager[55704]: <info>  [1764926826.4012] manager: (tap2d5ff262-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec 05 09:27:06 compute-1 kernel: tap2d5ff262-00: entered promiscuous mode
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.400 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.403 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2d5ff262-00, col_values=(('external_ids', {'iface-id': 'da7e4261-23bc-43e6-a0d1-1c34dd95f6c8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:06 compute-1 ovn_controller[95809]: 2025-12-05T09:27:06Z|00081|binding|INFO|Releasing lport da7e4261-23bc-43e6-a0d1-1c34dd95f6c8 from this chassis (sb_readonly=0)
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.404 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.418 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2d5ff262-0b2d-49fb-b643-980510ce97c7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2d5ff262-0b2d-49fb-b643-980510ce97c7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.418 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.419 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[dfca875f-2ae9-477d-8447-0d76f1c9bb17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.420 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-2d5ff262-0b2d-49fb-b643-980510ce97c7
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/2d5ff262-0b2d-49fb-b643-980510ce97c7.pid.haproxy
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 2d5ff262-0b2d-49fb-b643-980510ce97c7
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:27:06 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:06.422 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'env', 'PROCESS_TAG=haproxy-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2d5ff262-0b2d-49fb-b643-980510ce97c7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.522 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926826.5221272, 43d83f29-ba12-4205-ba09-545c3dc28920 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.523 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] VM Started (Lifecycle Event)
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.557 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.566 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926826.522379, 43d83f29-ba12-4205-ba09-545c3dc28920 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.567 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] VM Paused (Lifecycle Event)
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.595 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.599 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.645 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.646 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926826.6052036, 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.646 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] VM Started (Lifecycle Event)
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.669 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.673 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926826.605254, 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.673 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] VM Paused (Lifecycle Event)
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.693 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.696 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:27:06 compute-1 nova_compute[189066]: 2025-12-05 09:27:06.720 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:27:06 compute-1 podman[223453]: 2025-12-05 09:27:06.850256706 +0000 UTC m=+0.045882585 container create 23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:27:06 compute-1 systemd[1]: Started libpod-conmon-23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5.scope.
Dec 05 09:27:06 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:27:06 compute-1 podman[223453]: 2025-12-05 09:27:06.825515984 +0000 UTC m=+0.021141883 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:27:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1feb21e2e4d4c74cc9d649a43f86e416f91d98315e1476ee1fdc2a3f7dfd70d5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:27:06 compute-1 podman[223453]: 2025-12-05 09:27:06.936140147 +0000 UTC m=+0.131766046 container init 23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:27:06 compute-1 podman[223453]: 2025-12-05 09:27:06.943598292 +0000 UTC m=+0.139224171 container start 23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:27:06 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[223468]: [NOTICE]   (223472) : New worker (223474) forked
Dec 05 09:27:06 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[223468]: [NOTICE]   (223472) : Loading success.
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.002 105272 INFO neutron.agent.ovn.metadata.agent [-] Port d771a01e-fae5-4012-b36a-241c6c0bf739 in datapath e5fa9b67-cb83-4b68-927e-0eb9577e2a4d unbound from our chassis
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.005 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e5fa9b67-cb83-4b68-927e-0eb9577e2a4d
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.016 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[1d090b0b-e3ff-4a80-9573-f9653528cbca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.018 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape5fa9b67-c1 in ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.021 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape5fa9b67-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.021 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[28f50422-3f2f-48f5-81b2-beb1b4701be4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.022 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4aaf1b1d-ba56-4fec-b9ea-66db7cd088ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.035 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[4d788ab5-2816-4063-8e54-540e12b427e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.042 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.051 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc8f719-c484-4149-a2fd-88bd77bf3c4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.087 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[af1463ac-bdfb-42eb-8695-a10205b5a8fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.094 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[62dc3944-f2d7-4bf9-af53-98cd45b8c82b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 NetworkManager[55704]: <info>  [1764926827.0957] manager: (tape5fa9b67-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Dec 05 09:27:07 compute-1 systemd-udevd[223386]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.133 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c27519-2a53-432b-8ce1-5d8888b69234]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.139 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[2d257b3f-63b2-4c07-8d5f-311c5e42ed94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 NetworkManager[55704]: <info>  [1764926827.1662] device (tape5fa9b67-c0): carrier: link connected
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.170 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[6f94adae-7734-4b57-810c-3be90024be67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.196 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f684f331-b8c0-445f-be9e-81765e649ffb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5fa9b67-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:01:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410017, 'reachable_time': 44172, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223493, 'error': None, 'target': 'ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.225 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2e339f27-f108-4b7d-bed2-90920764e78d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe66:1ce'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 410017, 'tstamp': 410017}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223494, 'error': None, 'target': 'ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.249 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[81d67851-00a6-480d-a0de-261544179a93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5fa9b67-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:01:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410017, 'reachable_time': 44172, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223495, 'error': None, 'target': 'ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.284 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3284991a-38bf-4185-9e0b-494453f29065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.358 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c60acd28-89d8-46a5-a258-d29d5306ba5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.361 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5fa9b67-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.361 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.362 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5fa9b67-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.366 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:07 compute-1 NetworkManager[55704]: <info>  [1764926827.3672] manager: (tape5fa9b67-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec 05 09:27:07 compute-1 kernel: tape5fa9b67-c0: entered promiscuous mode
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.369 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.371 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape5fa9b67-c0, col_values=(('external_ids', {'iface-id': '6691f267-8fd6-45bc-bce0-2a0ee4f4c825'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.374 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:07 compute-1 ovn_controller[95809]: 2025-12-05T09:27:07Z|00082|binding|INFO|Releasing lport 6691f267-8fd6-45bc-bce0-2a0ee4f4c825 from this chassis (sb_readonly=0)
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.375 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.378 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e5fa9b67-cb83-4b68-927e-0eb9577e2a4d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e5fa9b67-cb83-4b68-927e-0eb9577e2a4d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.380 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8670dfe5-fe0b-42ae-94de-e1c4aa199e61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.381 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/e5fa9b67-cb83-4b68-927e-0eb9577e2a4d.pid.haproxy
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID e5fa9b67-cb83-4b68-927e-0eb9577e2a4d
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.382 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d', 'env', 'PROCESS_TAG=haproxy-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e5fa9b67-cb83-4b68-927e-0eb9577e2a4d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.387 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.512 189070 DEBUG nova.network.neutron [req-3caab629-9639-44e0-90b9-02b78d37b7b4 req-b7965b1c-f9c5-453f-925c-8beffa7f29f7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updated VIF entry in instance network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.512 189070 DEBUG nova.network.neutron [req-3caab629-9639-44e0-90b9-02b78d37b7b4 req-b7965b1c-f9c5-453f-925c-8beffa7f29f7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:07 compute-1 nova_compute[189066]: 2025-12-05 09:27:07.533 189070 DEBUG oslo_concurrency.lockutils [req-3caab629-9639-44e0-90b9-02b78d37b7b4 req-b7965b1c-f9c5-453f-925c-8beffa7f29f7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:07 compute-1 podman[223526]: 2025-12-05 09:27:07.797156408 +0000 UTC m=+0.052560809 container create 2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:27:07 compute-1 systemd[1]: Started libpod-conmon-2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c.scope.
Dec 05 09:27:07 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:27:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3152458a210273a36fba88993508d1329324094df5b774ac0a70e62a7e26f15c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:27:07 compute-1 podman[223526]: 2025-12-05 09:27:07.861120919 +0000 UTC m=+0.116525320 container init 2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 09:27:07 compute-1 podman[223526]: 2025-12-05 09:27:07.866887021 +0000 UTC m=+0.122291422 container start 2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:27:07 compute-1 podman[223526]: 2025-12-05 09:27:07.772672044 +0000 UTC m=+0.028076465 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:27:07 compute-1 neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d[223542]: [NOTICE]   (223556) : New worker (223562) forked
Dec 05 09:27:07 compute-1 neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d[223542]: [NOTICE]   (223556) : Loading success.
Dec 05 09:27:07 compute-1 podman[223539]: 2025-12-05 09:27:07.907488864 +0000 UTC m=+0.067367835 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.947 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 8977e440-f4bd-42c7-bf7b-c57e7184cc33 in datapath 0a97aec7-0780-4b5e-9498-e796fd7b42fd unbound from our chassis
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.949 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a97aec7-0780-4b5e-9498-e796fd7b42fd
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.963 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[37d4fec2-90cd-4449-80a5-ab7cf19f3e6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.964 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a97aec7-01 in ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.966 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a97aec7-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.966 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[682a2645-3abe-4e98-849c-b71912b98c13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.967 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c9615cbc-6d48-44a6-b0f7-bdd61a0a638f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:07 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:07.980 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[bde0e7a6-2c4a-4d99-a139-67fe274ad92b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.005 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[afcc6446-4a0e-42cb-855d-95b8e6c6af51]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 nova_compute[189066]: 2025-12-05 09:27:08.014 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.039 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[77881103-c792-46d1-bd85-634e56245059]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 NetworkManager[55704]: <info>  [1764926828.0474] manager: (tap0a97aec7-00): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.046 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b27c744b-6280-4718-b8a9-caa533d974b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 nova_compute[189066]: 2025-12-05 09:27:08.076 189070 DEBUG nova.network.neutron [req-60d45398-3de6-42ba-9364-5fbf59d348a5 req-01307fb7-3034-448c-a1d9-b71d972ce8ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Updated VIF entry in instance network info cache for port 8977e440-f4bd-42c7-bf7b-c57e7184cc33. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:27:08 compute-1 nova_compute[189066]: 2025-12-05 09:27:08.076 189070 DEBUG nova.network.neutron [req-60d45398-3de6-42ba-9364-5fbf59d348a5 req-01307fb7-3034-448c-a1d9-b71d972ce8ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Updating instance_info_cache with network_info: [{"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.089 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[5815423d-fe97-4324-8606-8ec2e6a67d12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.093 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[5d03cb7a-2c4f-4007-b86d-72cb14aa54d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 NetworkManager[55704]: <info>  [1764926828.1264] device (tap0a97aec7-00): carrier: link connected
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.133 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[5daf4d13-1c85-4743-88a9-3bf35e108251]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.156 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[746eafc2-a8bf-4f44-bb02-e811399c33ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a97aec7-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:cb:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410113, 'reachable_time': 15206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223586, 'error': None, 'target': 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.175 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[afbe0edc-43dd-4fda-a974-5455b1bb134b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe66:cbd0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 410113, 'tstamp': 410113}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223587, 'error': None, 'target': 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.192 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8faacee7-2069-4fb0-a528-2d1133502f18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a97aec7-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:cb:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410113, 'reachable_time': 15206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223588, 'error': None, 'target': 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.222 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[01e88c21-102c-4254-9a9e-0b9a4aa1744a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.288 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[77af463b-1c2b-4a5d-83ad-15f9aeb25486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.290 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a97aec7-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.291 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.291 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a97aec7-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:08 compute-1 NetworkManager[55704]: <info>  [1764926828.2949] manager: (tap0a97aec7-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Dec 05 09:27:08 compute-1 kernel: tap0a97aec7-00: entered promiscuous mode
Dec 05 09:27:08 compute-1 nova_compute[189066]: 2025-12-05 09:27:08.294 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.298 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a97aec7-00, col_values=(('external_ids', {'iface-id': '0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:08 compute-1 nova_compute[189066]: 2025-12-05 09:27:08.299 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:08 compute-1 ovn_controller[95809]: 2025-12-05T09:27:08Z|00083|binding|INFO|Releasing lport 0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa from this chassis (sb_readonly=0)
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.300 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a97aec7-0780-4b5e-9498-e796fd7b42fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a97aec7-0780-4b5e-9498-e796fd7b42fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.301 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c9b21473-9509-4c3c-b632-93cf6d5e70f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.302 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-0a97aec7-0780-4b5e-9498-e796fd7b42fd
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/0a97aec7-0780-4b5e-9498-e796fd7b42fd.pid.haproxy
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 0a97aec7-0780-4b5e-9498-e796fd7b42fd
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.303 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'env', 'PROCESS_TAG=haproxy-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a97aec7-0780-4b5e-9498-e796fd7b42fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:27:08 compute-1 nova_compute[189066]: 2025-12-05 09:27:08.310 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:08 compute-1 podman[223620]: 2025-12-05 09:27:08.712767498 +0000 UTC m=+0.054575829 container create 65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 09:27:08 compute-1 systemd[1]: Started libpod-conmon-65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30.scope.
Dec 05 09:27:08 compute-1 podman[223620]: 2025-12-05 09:27:08.682221593 +0000 UTC m=+0.024029924 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:27:08 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:27:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1768a45cc4efc6b51693b0a548df8d56a3d8a2f65a04214aaf142688f31bd53/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:27:08 compute-1 podman[223620]: 2025-12-05 09:27:08.803047089 +0000 UTC m=+0.144855410 container init 65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 09:27:08 compute-1 podman[223620]: 2025-12-05 09:27:08.810146954 +0000 UTC m=+0.151955265 container start 65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:27:08 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[223636]: [NOTICE]   (223640) : New worker (223642) forked
Dec 05 09:27:08 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[223636]: [NOTICE]   (223640) : Loading success.
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.872 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.872 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:08.873 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:08 compute-1 nova_compute[189066]: 2025-12-05 09:27:08.965 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:09 compute-1 nova_compute[189066]: 2025-12-05 09:27:09.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:09 compute-1 nova_compute[189066]: 2025-12-05 09:27:09.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:09 compute-1 nova_compute[189066]: 2025-12-05 09:27:09.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 09:27:09 compute-1 nova_compute[189066]: 2025-12-05 09:27:09.449 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:10 compute-1 nova_compute[189066]: 2025-12-05 09:27:10.040 189070 DEBUG oslo_concurrency.lockutils [req-60d45398-3de6-42ba-9364-5fbf59d348a5 req-01307fb7-3034-448c-a1d9-b71d972ce8ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.748 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'aa5b008731384d18ba83c8d69e76bcef', 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'hostId': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.752 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'e26ae3fdd48d4947978a480f70e14f84', 'user_id': '65751a90715341b2984ef84ebbaa1650', 'hostId': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.757 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 43d83f29-ba12-4205-ba09-545c3dc28920 / tap8977e440-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.757 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.761 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e / tapfee19e88-d1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.761 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6d0b3c6-6814-413f-8c7a-29335e8760ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.753702', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '9386353a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': '667d68d009aef60cf8fe99f5d0fdcee6cdb19c11b190ed457ca826ba2455ce5b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.753702', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '9386a880-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': 'aa18bb5d59919ccf33fbdc4cd1dab12cf8adb6d0d70ecb2b3fa2c0b7aff1cc60'}]}, 'timestamp': '2025-12-05 09:27:10.761970', '_unique_id': 'e02c3e2aacdb4313bb0dc474d96830b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.766 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.794 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.794 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.820 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.820 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c864b56-9f9a-4ea4-b1c6-e3799bc971fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.768845', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '938baa74-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': 'f63bd58ec0f002ccc09964101d9418c9146f55320d70d357d84c82cb8e1cf085'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 
'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.768845', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '938bba5a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': 'e4082643bb8bd0548b6944838f8d56d64e4ea79fea92622076d533e1ef330511'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.768845', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '938f9d64-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': '40212f351a7d2978dc54ede035b0b60a31e10e2d88d1d1b2e4c9bd9a2d818416'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.768845', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '938faaa2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': '65e0d34830a309b463680c0ce2d69dc5887c8ac11350e7dda0fcd832532c6565'}]}, 'timestamp': '2025-12-05 09:27:10.820953', '_unique_id': '2c2c3220a49241c694c251d9ae023440'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.828 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.843 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.863 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '425f9692-05b2-4cb9-969d-f33ea372affa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'timestamp': '2025-12-05T09:27:10.828438', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '93939fcc-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.901361616, 'message_signature': 'b5c5c14308d2adc158cd1afec798ff9769845194932059d843e7bc11b13a4d2a'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 
'9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'timestamp': '2025-12-05T09:27:10.828438', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9396582a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.9213389, 'message_signature': 'fd736619c3dc0af56f266755c6ee6b5a345b429d1172d8e82436690e9b491ba9'}]}, 'timestamp': '2025-12-05 09:27:10.864805', '_unique_id': '120f33692f0649ccb0bddb32f464a2e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.867 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.867 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.868 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.868 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7daadc7-f16c-471c-8fe7-4c1b35cb3273', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.867468', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9396d304-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '17891af0e9ca8042c2d7feeedfd150732a4ab174a4b20e468807e95262d2d06c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 
'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.867468', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9396e06a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '7ae74a4b41af61c17ef95dbbf7761df131b0bcd43e54c600e5a5780037ba161f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.867468', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9396ec72-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': '5f62ecb383372533dcf2a52c01796a16a6396d01bbd5e66cd784b17e6de4b22d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.867468', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9396fb22-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': '9edb151cfb911606e3d9f963e23d731b600cc14750e3ba50ff978e772ba3606b'}]}, 'timestamp': '2025-12-05 09:27:10.868889', '_unique_id': 'e0533154a33a433b98ddabde4086400a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.869 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.870 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.871 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.871 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.871 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a10fac9b-0522-4f78-b1f8-145b968eb3b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.870944', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93975964-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '844545d9381af1154f3ae17505daf1e095f9780a9244944046fc45fbfb161558'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 
'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.870944', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939766c0-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '4fefd99354b8b77253f236cf4d38a9f904c99dc4e7c53dbaafb15844e3533c55'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.870944', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9397749e-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': '62b9dcf7950ce438a19839dd18bf60defacd5a650fe54131cace1dc3d6bb0197'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.870944', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93978812-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': 'aeb29854577bf8144f3cd89257f997560f2f3840152c4fd491e776a2e22135aa'}]}, 'timestamp': '2025-12-05 09:27:10.872511', '_unique_id': 'd250ad0fee794ecb9456cd69ead467a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.873 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.874 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.874 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.874 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31cd377b-7944-47c1-85d0-ecc3eebec070', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.874617', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '9397e802-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': '406d690861da691e261491ec388ce09a821e8772e8ec47df65fbbd329ce56a7a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.874617', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '9397f414-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': 'e25df3150d91c0b3e2bd79f9f0d34bc5acc66089fe39eec2a72f877f117970c1'}]}, 'timestamp': '2025-12-05 09:27:10.875281', '_unique_id': '547a262cbd71447aa2379c838e531cdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.876 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.877 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.877 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.878 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e47c8e7-2a80-4cd0-b787-a9f272c08e71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.877774', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '9398652a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': '565cce04f221763bdbbdc669965d286af5d6aa092a8c2a3776c822a4f8911429'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.877774', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '93987ede-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': 'f3cd86f4f3b41ff1a56132fd0309dd5c8ed6005ed45b822a0a4c25ee00719809'}]}, 'timestamp': '2025-12-05 09:27:10.878884', '_unique_id': '0578848130cb48c7bb23529a1e206e47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.879 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.881 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.881 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1534053809>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1301659118>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1534053809>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1301659118>]
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.882 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.882 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.883 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.883 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '324d0ffe-fad5-478e-b84a-ba32b4719f02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.882383', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93991a38-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '347dcf2ab9404c2f53445b2a42f493f1193b255f05d4d8bfafd69556b6138093'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 
'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.882383', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93992ed8-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': 'c0e7404d3c55c2c2363c9c5cda4b652b7dc26d19a309a425a066b06ef4d10511'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.882383', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93993ca2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': '7b981e3a2354e7fe50c63b4bd925fd4296e2a5b14891e42750351dd8836d1595'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.882383', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93994b48-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': '53cd737e6e91ffca816065b2b469f4767950773580ddb90afabd1f4286f18bb7'}]}, 'timestamp': '2025-12-05 09:27:10.884068', '_unique_id': 'b49023e72d084c28a896d8f88ee249bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.886 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.886 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1534053809>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1301659118>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1534053809>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1301659118>]
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.886 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.886 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcc23e44-414a-4afa-b604-445bbdca501c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.886619', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '9399bcc2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': 'b1c6a63cb46596b4ca0d26e20f70bf14c8d1c373f2429cda0de18b16ac4e0d12'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.886619', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '9399c942-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': '4d4340de40e0da0361403e5e189ccc6cabbe86bdcc57543c6e1fbbf156b08787'}]}, 'timestamp': '2025-12-05 09:27:10.887289', '_unique_id': '504e2063c4784bf0ba0f6c65e4e214c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.888 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.889 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.889 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 43d83f29-ba12-4205-ba09-545c3dc28920: ceilometer.compute.pollsters.NoVolumeException
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.889 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.889 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e: ceilometer.compute.pollsters.NoVolumeException
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.890 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.890 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.890 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.890 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.890 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '311defb8-e84d-4157-b06c-7b726f91a110', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.890120', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '939a448a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '4a326c6199a76ee1d82a0ab26f0f4562f1e4a38d72a28ba59c6c80eada59ed1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 
'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.890120', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939a4f70-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '28a4bfcaf33acf48c31481148c6053e42814c6c9a5de7fc9653705a5a5bbc1c7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.890120', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '939a58e4-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': '1e114651dd35972c8c71ba61273cf0fc70fd45813c50a4ff7f0e5a9cdf0ea826'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.890120', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939a61ea-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': 'e3a8333f1f1dff571ae4786eaa0b375bb8753a43a3ba17ef491c2a94aac39361'}]}, 'timestamp': '2025-12-05 09:27:10.891157', '_unique_id': 'c907a8ecdd6d46fdb7b29670b1c0f9fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.891 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.892 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.904 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.905 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.914 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.914 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6550918d-03fc-4018-b877-163394d2b2c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.893073', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '939c8d4e-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.950641954, 'message_signature': 'f0daecdb4d36b662133ef09ec628ee92864870dc561d80b28d437f6766b0bb03'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 
'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.893073', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939c9c26-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.950641954, 'message_signature': '854a1f132efb821307f82ddda96e8791a625509b47965577050bedadf4d3f6da'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.893073', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '939df5a8-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.963298606, 'message_signature': '4d21f601d84dfa3454c6a6a8438978e477c61cbba8c3a6f908d2b82f057a5709'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.893073', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939e011a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.963298606, 'message_signature': '45fd48853494856268946ba31de26545b16d08a65509b5510b9ac0332bef4c3a'}]}, 'timestamp': '2025-12-05 09:27:10.914918', '_unique_id': '67aa882a72154629af48c85baa537fc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.916 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.917 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.917 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.917 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0474b3d6-f63f-4b99-8031-3f93db67037e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.917339', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '939e6b6e-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '09032ec463c9076f87ebe5d17475a87bf3567b0826ec612995fa1e990ec98745'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 
'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.917339', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939e765e-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.826513798, 'message_signature': '955d5c56fe878a4cc88a6691577900d3faa1e70f212af18c668795414bb3ddbf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.917339', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '939e8004-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': 'a3388d52fccfd6ec5e17092533eedbcad110da5afaa2bf59047d509db24f6c6f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.917339', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939e8a04-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.852671843, 'message_signature': 'fb7614bb5ab561f2897f15b881d4b6911633b3f196ce5f5086f714669bfa966e'}]}, 'timestamp': '2025-12-05 09:27:10.918403', '_unique_id': '198ce02451e54e55b3a56dcb033486e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.918 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.920 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.920 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '589fbb1d-3620-4588-be0f-6f8243b9bd59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.920072', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '939ed77a-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': '7e00aa267bc23dd3326b4240a4ec0c658c9d1aa72fc5b8ecd39993cf9fe92248'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.920072', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '939ee47c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': '65437fcafcc0a144868364924627c582898898841b1bc12fe027e95bde866c58'}]}, 'timestamp': '2025-12-05 09:27:10.920728', '_unique_id': '3bf8b375f4bc4c7786bc4258884edc3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.921 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.922 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.922 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1534053809>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1301659118>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1534053809>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1301659118>]
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.923 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.923 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9207b831-6c2f-4493-b636-86fdb26b98ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.922990', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '939f47e6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': 'f364fd98a2d762e3c4e9b5597f7229ba7b26793d61a928f24117dda16e507cf6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.922990', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '939f5240-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': '0cccaa9b01c0b102c3ddf88e6e0f83b2b2e2dbb9656d07d820b5082321bf0f12'}]}, 'timestamp': '2025-12-05 09:27:10.923556', '_unique_id': 'e04791a8a4d34379ae57cf54dd4cc53d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.924 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.925 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.925 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.925 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.926 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '453d9bfe-9091-40aa-b9fc-456ea8f7d9a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.925299', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '939fa2ea-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.950641954, 'message_signature': '23d87ee763621f3419b183c527dad505c55d4c023e94aa1cdf0c8260b84f263e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 
'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.925299', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939fae66-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.950641954, 'message_signature': '84439f4c1c3b0685e71aff150dfe1150ea3bb33538edbecca8ca5a64f163dc25'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.925299', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '939fb924-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.963298606, 'message_signature': 'f307ab30063ab9f67191d0cad648da0be5ca7a67a65e8b92de9b52d50d862112'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.925299', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '939fc2b6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.963298606, 'message_signature': '4bb7cb2c6d50777f9c2debe7b6f308bb23c7d2167e88d1e69c8b83ac8ecf09d9'}]}, 'timestamp': '2025-12-05 09:27:10.926402', '_unique_id': 'a038622581c14589a772dffdc85857d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.927 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.928 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.928 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1534053809>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1301659118>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1534053809>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1301659118>]
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.928 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.928 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec891acf-778f-446a-ac1d-a165508a2d7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.928619', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '93a023be-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': 'c87bbb8e042fc4a91245b9d8cce52e8979a86db3772b930f91e9f396ec86494d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.928619', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '93a02de6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': 'e658f7048edb291f7c03e39abc9b59a84dc38af4dad2722e14bddfc4ed05b691'}]}, 'timestamp': '2025-12-05 09:27:10.929155', '_unique_id': 'ac674d6eaff94a5dab5664a02fa76674'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.929 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.930 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.930 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.930 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.931 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.931 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bb72ab1-b6d5-42e5-878a-5900295562b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': '43d83f29-ba12-4205-ba09-545c3dc28920-vda', 'timestamp': '2025-12-05T09:27:10.930699', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93a074ea-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.950641954, 'message_signature': '8e03a1655c021321d30b043deaf976a4f8831f25b387ea43ae440f2d16f6aba2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 
'43d83f29-ba12-4205-ba09-545c3dc28920-sda', 'timestamp': '2025-12-05T09:27:10.930699', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'instance-0000000b', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93a07ec2-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.950641954, 'message_signature': '9d8b5ad662b8ff0a676c1720d5de75ca767499c7bc725e33130d0cba99871cb4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-vda', 'timestamp': '2025-12-05T09:27:10.930699', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93a08854-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.963298606, 'message_signature': '49f59f134d84965d32eb07fae89e6f90fc6f3f9a36e59d382f3fdf90bd2a46cf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-sda', 'timestamp': '2025-12-05T09:27:10.930699', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'instance-0000000a', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93a09272-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.963298606, 'message_signature': '32e1483311eef43bf87a037c1d1a573fdc237622ceb64e5368325d8e46602aaa'}]}, 'timestamp': '2025-12-05 09:27:10.931719', '_unique_id': 'fcbd1d5b913c49d8876439d172981f38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.932 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.933 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.933 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '670b160b-e53c-417e-91d1-47f9d8c89fb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.933308', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '93a0dae8-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': '8aa8c3819934f450670fcd6e780e279f583ac115e3606e58d68b3d7e92817040'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.933308', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '93a0e5f6-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': '2bc3beddd497c629761d8c498e2377bb08ebcb6bc609ed35362cbf52c6a686f2'}]}, 'timestamp': '2025-12-05 09:27:10.933867', '_unique_id': '7bfbb11a90194b43ad48f5009d427e00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.934 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.935 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.935 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ca5466d-ab57-49fa-9391-f02c69349f7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.935374', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '93a12c50-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': 'a6413f115aa44fbeb6d0a21b2b58a0535bbf6f78b16e254e6cef2d840f7721c3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.935374', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '93a136c8-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': 'dd14de7f77ee8715448e13217c6fd91ccfeb4e72c7d57cbd5ed9363ce8783c05'}]}, 'timestamp': '2025-12-05 09:27:10.935936', '_unique_id': '466c74dc58ba416793c7311a0a5bc760'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.936 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.937 12 DEBUG ceilometer.compute.pollsters [-] 43d83f29-ba12-4205-ba09-545c3dc28920/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.937 12 DEBUG ceilometer.compute.pollsters [-] 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5360da7a-837b-4039-9534-0a63702929b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a9b7ab1c9c854146af8af16f337a063d', 'user_name': None, 'project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'project_name': None, 'resource_id': 'instance-0000000b-43d83f29-ba12-4205-ba09-545c3dc28920-tap8977e440-f4', 'timestamp': '2025-12-05T09:27:10.937487', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1534053809', 'name': 'tap8977e440-f4', 'instance_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'instance_type': 'm1.nano', 'host': '775d6a56e5645c4fc9619f653366421d7b59c37709cb5996def4234c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:c8:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8977e440-f4'}, 'message_id': '93a17ee4-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.811399844, 'message_signature': '47a797b19079b2ff1f7a2a7af5e93a8c560b7c5b79cfd6a95ca8b4efd3879ae7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'65751a90715341b2984ef84ebbaa1650', 'user_name': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_name': None, 'resource_id': 'instance-0000000a-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-tapfee19e88-d1', 'timestamp': '2025-12-05T09:27:10.937487', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1301659118', 'name': 'tapfee19e88-d1', 'instance_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'instance_type': 'm1.nano', 'host': 'e310e8c2de91dded1bf420f01eae623153594a2dfb619537eb8a35c6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:12:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfee19e88-d1'}, 'message_id': '93a1890c-d1bc-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4103.816751647, 'message_signature': '9bdeb9b68bbd3ab741012bcb650acd78dc3069d7f73cbd89d7fd746d49613c02'}]}, 'timestamp': '2025-12-05 09:27:10.938043', '_unique_id': '137424b6e09a4aa49017283325561239'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:27:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:27:10.938 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.044 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.044 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.228 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.228 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.229 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.229 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:27:11 compute-1 podman[223651]: 2025-12-05 09:27:11.344622527 +0000 UTC m=+0.063723795 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.367 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.430 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.431 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.491 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.498 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.563 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.564 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.628 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.793 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.794 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5575MB free_disk=73.3319206237793GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.795 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.795 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:11.805 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.805 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:11.807 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.949 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.950 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 43d83f29-ba12-4205-ba09-545c3dc28920 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.950 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.951 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.996 189070 DEBUG nova.compute.manager [req-dba5e495-2375-4c13-af30-c31f2c7eba54 req-59e8729f-a65f-4370-b2fb-192592617383 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.996 189070 DEBUG oslo_concurrency.lockutils [req-dba5e495-2375-4c13-af30-c31f2c7eba54 req-59e8729f-a65f-4370-b2fb-192592617383 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.997 189070 DEBUG oslo_concurrency.lockutils [req-dba5e495-2375-4c13-af30-c31f2c7eba54 req-59e8729f-a65f-4370-b2fb-192592617383 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.997 189070 DEBUG oslo_concurrency.lockutils [req-dba5e495-2375-4c13-af30-c31f2c7eba54 req-59e8729f-a65f-4370-b2fb-192592617383 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.997 189070 DEBUG nova.compute.manager [req-dba5e495-2375-4c13-af30-c31f2c7eba54 req-59e8729f-a65f-4370-b2fb-192592617383 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Processing event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:27:11 compute-1 nova_compute[189066]: 2025-12-05 09:27:11.998 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.003 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926832.0029984, 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.003 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] VM Resumed (Lifecycle Event)
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.006 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.009 189070 INFO nova.virt.libvirt.driver [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance spawned successfully.
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.010 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.094 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing inventories for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.152 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.157 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.157 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.158 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.158 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.159 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.159 189070 DEBUG nova.virt.libvirt.driver [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.163 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.207 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating ProviderTree inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.208 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.222 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.240 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing aggregate associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.272 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing trait associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.309 189070 INFO nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Took 14.57 seconds to spawn the instance on the hypervisor.
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.310 189070 DEBUG nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.390 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.412 189070 INFO nova.compute.manager [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Took 16.24 seconds to build instance.
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.416 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.451 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.452 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.452 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.453 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.455 189070 DEBUG oslo_concurrency.lockutils [None req-6ce70ff3-e851-4175-bf63-4a9beff89553 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:12 compute-1 nova_compute[189066]: 2025-12-05 09:27:12.481 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 09:27:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:12.809 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:13 compute-1 nova_compute[189066]: 2025-12-05 09:27:13.461 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:13 compute-1 nova_compute[189066]: 2025-12-05 09:27:13.462 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:27:13 compute-1 nova_compute[189066]: 2025-12-05 09:27:13.462 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:27:13 compute-1 nova_compute[189066]: 2025-12-05 09:27:13.503 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 09:27:13 compute-1 nova_compute[189066]: 2025-12-05 09:27:13.968 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.149 189070 DEBUG nova.compute.manager [req-9dae674b-8bab-4175-afbe-b9a7f6c054c0 req-f4dcc72b-d42a-4a75-91a1-66e87f4a6472 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.150 189070 DEBUG oslo_concurrency.lockutils [req-9dae674b-8bab-4175-afbe-b9a7f6c054c0 req-f4dcc72b-d42a-4a75-91a1-66e87f4a6472 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.150 189070 DEBUG oslo_concurrency.lockutils [req-9dae674b-8bab-4175-afbe-b9a7f6c054c0 req-f4dcc72b-d42a-4a75-91a1-66e87f4a6472 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.150 189070 DEBUG oslo_concurrency.lockutils [req-9dae674b-8bab-4175-afbe-b9a7f6c054c0 req-f4dcc72b-d42a-4a75-91a1-66e87f4a6472 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.151 189070 DEBUG nova.compute.manager [req-9dae674b-8bab-4175-afbe-b9a7f6c054c0 req-f4dcc72b-d42a-4a75-91a1-66e87f4a6472 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.151 189070 WARNING nova.compute.manager [req-9dae674b-8bab-4175-afbe-b9a7f6c054c0 req-f4dcc72b-d42a-4a75-91a1-66e87f4a6472 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state active and task_state None.
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.205 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.206 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.207 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.207 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:27:14 compute-1 nova_compute[189066]: 2025-12-05 09:27:14.452 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:15 compute-1 podman[223684]: 2025-12-05 09:27:15.635024891 +0000 UTC m=+0.072028580 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.808 189070 DEBUG nova.compute.manager [req-195e8287-a5b1-4d68-87b2-6e557c939d87 req-2ad83dce-841c-4faf-bb9c-d83438417312 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.809 189070 DEBUG oslo_concurrency.lockutils [req-195e8287-a5b1-4d68-87b2-6e557c939d87 req-2ad83dce-841c-4faf-bb9c-d83438417312 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.809 189070 DEBUG oslo_concurrency.lockutils [req-195e8287-a5b1-4d68-87b2-6e557c939d87 req-2ad83dce-841c-4faf-bb9c-d83438417312 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.809 189070 DEBUG oslo_concurrency.lockutils [req-195e8287-a5b1-4d68-87b2-6e557c939d87 req-2ad83dce-841c-4faf-bb9c-d83438417312 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.810 189070 DEBUG nova.compute.manager [req-195e8287-a5b1-4d68-87b2-6e557c939d87 req-2ad83dce-841c-4faf-bb9c-d83438417312 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Processing event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.810 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.815 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926836.8144646, 43d83f29-ba12-4205-ba09-545c3dc28920 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.816 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] VM Resumed (Lifecycle Event)
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.820 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.826 189070 INFO nova.virt.libvirt.driver [-] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Instance spawned successfully.
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.826 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.848 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.854 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.859 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.859 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.860 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.860 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.861 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.861 189070 DEBUG nova.virt.libvirt.driver [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.901 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.972 189070 INFO nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Took 18.59 seconds to spawn the instance on the hypervisor.
Dec 05 09:27:16 compute-1 nova_compute[189066]: 2025-12-05 09:27:16.973 189070 DEBUG nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.005 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.074 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.074 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.075 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.076 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.076 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.076 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.080 189070 INFO nova.compute.manager [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Took 19.97 seconds to build instance.
Dec 05 09:27:17 compute-1 nova_compute[189066]: 2025-12-05 09:27:17.105 189070 DEBUG oslo_concurrency.lockutils [None req-084a5c26-267c-4925-bdee-4818a6d5ea30 a9b7ab1c9c854146af8af16f337a063d aa5b008731384d18ba83c8d69e76bcef - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:18 compute-1 podman[223707]: 2025-12-05 09:27:18.616167539 +0000 UTC m=+0.055103312 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:27:18 compute-1 nova_compute[189066]: 2025-12-05 09:27:18.970 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.057 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:19 compute-1 NetworkManager[55704]: <info>  [1764926839.0582] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Dec 05 09:27:19 compute-1 NetworkManager[55704]: <info>  [1764926839.0593] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.133 189070 DEBUG nova.compute.manager [req-637477a1-889b-4246-a8ce-6a9201e20067 req-00636993-4f5a-4c62-b902-626d61902439 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.134 189070 DEBUG oslo_concurrency.lockutils [req-637477a1-889b-4246-a8ce-6a9201e20067 req-00636993-4f5a-4c62-b902-626d61902439 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.135 189070 DEBUG oslo_concurrency.lockutils [req-637477a1-889b-4246-a8ce-6a9201e20067 req-00636993-4f5a-4c62-b902-626d61902439 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.135 189070 DEBUG oslo_concurrency.lockutils [req-637477a1-889b-4246-a8ce-6a9201e20067 req-00636993-4f5a-4c62-b902-626d61902439 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.135 189070 DEBUG nova.compute.manager [req-637477a1-889b-4246-a8ce-6a9201e20067 req-00636993-4f5a-4c62-b902-626d61902439 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] No waiting events found dispatching network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.136 189070 WARNING nova.compute.manager [req-637477a1-889b-4246-a8ce-6a9201e20067 req-00636993-4f5a-4c62-b902-626d61902439 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received unexpected event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 for instance with vm_state active and task_state None.
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.237 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:19 compute-1 ovn_controller[95809]: 2025-12-05T09:27:19Z|00084|binding|INFO|Releasing lport da7e4261-23bc-43e6-a0d1-1c34dd95f6c8 from this chassis (sb_readonly=0)
Dec 05 09:27:19 compute-1 ovn_controller[95809]: 2025-12-05T09:27:19Z|00085|binding|INFO|Releasing lport 0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa from this chassis (sb_readonly=0)
Dec 05 09:27:19 compute-1 ovn_controller[95809]: 2025-12-05T09:27:19Z|00086|binding|INFO|Releasing lport 6691f267-8fd6-45bc-bce0-2a0ee4f4c825 from this chassis (sb_readonly=0)
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.265 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:19 compute-1 nova_compute[189066]: 2025-12-05 09:27:19.455 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:20 compute-1 nova_compute[189066]: 2025-12-05 09:27:20.464 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:22 compute-1 nova_compute[189066]: 2025-12-05 09:27:22.970 189070 DEBUG nova.compute.manager [req-af66b374-975d-469d-b567-cd3cb3ab08a8 req-bcbc8a24-c336-4f58-a348-9a485491ecb4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:22 compute-1 nova_compute[189066]: 2025-12-05 09:27:22.971 189070 DEBUG nova.compute.manager [req-af66b374-975d-469d-b567-cd3cb3ab08a8 req-bcbc8a24-c336-4f58-a348-9a485491ecb4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing instance network info cache due to event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:27:22 compute-1 nova_compute[189066]: 2025-12-05 09:27:22.972 189070 DEBUG oslo_concurrency.lockutils [req-af66b374-975d-469d-b567-cd3cb3ab08a8 req-bcbc8a24-c336-4f58-a348-9a485491ecb4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:22 compute-1 nova_compute[189066]: 2025-12-05 09:27:22.972 189070 DEBUG oslo_concurrency.lockutils [req-af66b374-975d-469d-b567-cd3cb3ab08a8 req-bcbc8a24-c336-4f58-a348-9a485491ecb4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:22 compute-1 nova_compute[189066]: 2025-12-05 09:27:22.973 189070 DEBUG nova.network.neutron [req-af66b374-975d-469d-b567-cd3cb3ab08a8 req-bcbc8a24-c336-4f58-a348-9a485491ecb4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:27:23 compute-1 nova_compute[189066]: 2025-12-05 09:27:23.973 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:24 compute-1 nova_compute[189066]: 2025-12-05 09:27:24.457 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:24 compute-1 nova_compute[189066]: 2025-12-05 09:27:24.478 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Check if temp file /var/lib/nova/instances/tmp_5yqv494 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Dec 05 09:27:24 compute-1 nova_compute[189066]: 2025-12-05 09:27:24.479 189070 DEBUG nova.compute.manager [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_5yqv494',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='43d83f29-ba12-4205-ba09-545c3dc28920',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Dec 05 09:27:24 compute-1 podman[223740]: 2025-12-05 09:27:24.648286563 +0000 UTC m=+0.077370152 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:27:24 compute-1 nova_compute[189066]: 2025-12-05 09:27:24.747 189070 DEBUG nova.network.neutron [req-af66b374-975d-469d-b567-cd3cb3ab08a8 req-bcbc8a24-c336-4f58-a348-9a485491ecb4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updated VIF entry in instance network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:27:24 compute-1 nova_compute[189066]: 2025-12-05 09:27:24.749 189070 DEBUG nova.network.neutron [req-af66b374-975d-469d-b567-cd3cb3ab08a8 req-bcbc8a24-c336-4f58-a348-9a485491ecb4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:24 compute-1 nova_compute[189066]: 2025-12-05 09:27:24.805 189070 DEBUG oslo_concurrency.lockutils [req-af66b374-975d-469d-b567-cd3cb3ab08a8 req-bcbc8a24-c336-4f58-a348-9a485491ecb4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:24 compute-1 sshd-session[223732]: Received disconnect from 185.118.15.236 port 36286:11: Bye Bye [preauth]
Dec 05 09:27:24 compute-1 sshd-session[223732]: Disconnected from authenticating user root 185.118.15.236 port 36286 [preauth]
Dec 05 09:27:25 compute-1 ovn_controller[95809]: 2025-12-05T09:27:25Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ad:12:e3 10.100.0.14
Dec 05 09:27:25 compute-1 ovn_controller[95809]: 2025-12-05T09:27:25Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ad:12:e3 10.100.0.14
Dec 05 09:27:26 compute-1 nova_compute[189066]: 2025-12-05 09:27:26.003 189070 DEBUG oslo_concurrency.processutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:26 compute-1 nova_compute[189066]: 2025-12-05 09:27:26.081 189070 DEBUG oslo_concurrency.processutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:26 compute-1 nova_compute[189066]: 2025-12-05 09:27:26.086 189070 DEBUG oslo_concurrency.processutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:26 compute-1 nova_compute[189066]: 2025-12-05 09:27:26.148 189070 DEBUG oslo_concurrency.processutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:28 compute-1 sshd-session[223788]: Accepted publickey for nova from 192.168.122.102 port 49958 ssh2: ECDSA SHA256:SmkhuBePRe5VD3eW9pHWZd8sXFprcvpDE1m9LAG/9Ps
Dec 05 09:27:28 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Dec 05 09:27:28 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec 05 09:27:28 compute-1 systemd-logind[807]: New session 30 of user nova.
Dec 05 09:27:28 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec 05 09:27:28 compute-1 systemd[1]: Starting User Manager for UID 42436...
Dec 05 09:27:28 compute-1 systemd[223792]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 09:27:29 compute-1 nova_compute[189066]: 2025-12-05 09:27:29.010 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:29 compute-1 systemd[223792]: Queued start job for default target Main User Target.
Dec 05 09:27:29 compute-1 systemd[223792]: Created slice User Application Slice.
Dec 05 09:27:29 compute-1 systemd[223792]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 05 09:27:29 compute-1 systemd[223792]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 09:27:29 compute-1 systemd[223792]: Reached target Paths.
Dec 05 09:27:29 compute-1 systemd[223792]: Reached target Timers.
Dec 05 09:27:29 compute-1 systemd[223792]: Starting D-Bus User Message Bus Socket...
Dec 05 09:27:29 compute-1 systemd[223792]: Starting Create User's Volatile Files and Directories...
Dec 05 09:27:29 compute-1 systemd[223792]: Listening on D-Bus User Message Bus Socket.
Dec 05 09:27:29 compute-1 systemd[223792]: Reached target Sockets.
Dec 05 09:27:29 compute-1 systemd[223792]: Finished Create User's Volatile Files and Directories.
Dec 05 09:27:29 compute-1 systemd[223792]: Reached target Basic System.
Dec 05 09:27:29 compute-1 systemd[223792]: Reached target Main User Target.
Dec 05 09:27:29 compute-1 systemd[223792]: Startup finished in 164ms.
Dec 05 09:27:29 compute-1 systemd[1]: Started User Manager for UID 42436.
Dec 05 09:27:29 compute-1 systemd[1]: Started Session 30 of User nova.
Dec 05 09:27:29 compute-1 sshd-session[223788]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 09:27:29 compute-1 sshd-session[223808]: Received disconnect from 192.168.122.102 port 49958:11: disconnected by user
Dec 05 09:27:29 compute-1 sshd-session[223808]: Disconnected from user nova 192.168.122.102 port 49958
Dec 05 09:27:29 compute-1 sshd-session[223788]: pam_unix(sshd:session): session closed for user nova
Dec 05 09:27:29 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Dec 05 09:27:29 compute-1 systemd-logind[807]: Session 30 logged out. Waiting for processes to exit.
Dec 05 09:27:29 compute-1 systemd-logind[807]: Removed session 30.
Dec 05 09:27:29 compute-1 nova_compute[189066]: 2025-12-05 09:27:29.460 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:30 compute-1 ovn_controller[95809]: 2025-12-05T09:27:30Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:81:c8:37 10.100.0.12
Dec 05 09:27:30 compute-1 ovn_controller[95809]: 2025-12-05T09:27:30Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:81:c8:37 10.100.0.12
Dec 05 09:27:32 compute-1 nova_compute[189066]: 2025-12-05 09:27:32.663 189070 INFO nova.compute.manager [None req-6caa070d-e57a-4ee7-a421-450627dd976d 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Get console output
Dec 05 09:27:32 compute-1 nova_compute[189066]: 2025-12-05 09:27:32.672 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:27:32 compute-1 podman[223832]: 2025-12-05 09:27:32.686452129 +0000 UTC m=+0.110241805 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Dec 05 09:27:33 compute-1 nova_compute[189066]: 2025-12-05 09:27:33.548 189070 DEBUG nova.compute.manager [req-4d795af7-1acc-4238-a983-16a7bca3b482 req-39e3e49b-7a7a-43f5-99ca-7062cd8cbbe9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-unplugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:33 compute-1 nova_compute[189066]: 2025-12-05 09:27:33.548 189070 DEBUG oslo_concurrency.lockutils [req-4d795af7-1acc-4238-a983-16a7bca3b482 req-39e3e49b-7a7a-43f5-99ca-7062cd8cbbe9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:33 compute-1 nova_compute[189066]: 2025-12-05 09:27:33.549 189070 DEBUG oslo_concurrency.lockutils [req-4d795af7-1acc-4238-a983-16a7bca3b482 req-39e3e49b-7a7a-43f5-99ca-7062cd8cbbe9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:33 compute-1 nova_compute[189066]: 2025-12-05 09:27:33.549 189070 DEBUG oslo_concurrency.lockutils [req-4d795af7-1acc-4238-a983-16a7bca3b482 req-39e3e49b-7a7a-43f5-99ca-7062cd8cbbe9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:33 compute-1 nova_compute[189066]: 2025-12-05 09:27:33.549 189070 DEBUG nova.compute.manager [req-4d795af7-1acc-4238-a983-16a7bca3b482 req-39e3e49b-7a7a-43f5-99ca-7062cd8cbbe9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] No waiting events found dispatching network-vif-unplugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:33 compute-1 nova_compute[189066]: 2025-12-05 09:27:33.549 189070 DEBUG nova.compute.manager [req-4d795af7-1acc-4238-a983-16a7bca3b482 req-39e3e49b-7a7a-43f5-99ca-7062cd8cbbe9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-unplugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:27:34 compute-1 nova_compute[189066]: 2025-12-05 09:27:34.014 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:34 compute-1 nova_compute[189066]: 2025-12-05 09:27:34.464 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:36 compute-1 podman[223852]: 2025-12-05 09:27:36.675367045 +0000 UTC m=+0.107963240 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:27:37 compute-1 nova_compute[189066]: 2025-12-05 09:27:37.670 189070 DEBUG nova.compute.manager [req-4c51eb04-9e46-4113-9856-5470b795ad0c req-1ed8479a-73f9-4245-a989-4ed3c6089ff5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:37 compute-1 nova_compute[189066]: 2025-12-05 09:27:37.671 189070 DEBUG oslo_concurrency.lockutils [req-4c51eb04-9e46-4113-9856-5470b795ad0c req-1ed8479a-73f9-4245-a989-4ed3c6089ff5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:37 compute-1 nova_compute[189066]: 2025-12-05 09:27:37.671 189070 DEBUG oslo_concurrency.lockutils [req-4c51eb04-9e46-4113-9856-5470b795ad0c req-1ed8479a-73f9-4245-a989-4ed3c6089ff5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:37 compute-1 nova_compute[189066]: 2025-12-05 09:27:37.671 189070 DEBUG oslo_concurrency.lockutils [req-4c51eb04-9e46-4113-9856-5470b795ad0c req-1ed8479a-73f9-4245-a989-4ed3c6089ff5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:37 compute-1 nova_compute[189066]: 2025-12-05 09:27:37.671 189070 DEBUG nova.compute.manager [req-4c51eb04-9e46-4113-9856-5470b795ad0c req-1ed8479a-73f9-4245-a989-4ed3c6089ff5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] No waiting events found dispatching network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:37 compute-1 nova_compute[189066]: 2025-12-05 09:27:37.671 189070 WARNING nova.compute.manager [req-4c51eb04-9e46-4113-9856-5470b795ad0c req-1ed8479a-73f9-4245-a989-4ed3c6089ff5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received unexpected event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 for instance with vm_state active and task_state migrating.
Dec 05 09:27:37 compute-1 nova_compute[189066]: 2025-12-05 09:27:37.809 189070 INFO nova.compute.manager [None req-aa84c6a2-89df-4c18-bf2c-5373a53a9049 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Get console output
Dec 05 09:27:37 compute-1 nova_compute[189066]: 2025-12-05 09:27:37.814 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.447 189070 INFO nova.compute.manager [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Took 12.30 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.448 189070 DEBUG nova.compute.manager [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.470 189070 DEBUG nova.compute.manager [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_5yqv494',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='43d83f29-ba12-4205-ba09-545c3dc28920',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(f230ce5b-ec62-4081-8078-d62bbbc1ecbd),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.504 189070 DEBUG nova.objects.instance [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lazy-loading 'migration_context' on Instance uuid 43d83f29-ba12-4205-ba09-545c3dc28920 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.506 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.507 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.508 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.528 189070 DEBUG nova.virt.libvirt.vif [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:26:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1534053809',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1534053809',id=11,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:27:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='aa5b008731384d18ba83c8d69e76bcef',ramdisk_id='',reservation_id='r-mjfc6v5g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-63392288',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-63392288-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:27:17Z,user_data=None,user_id='a9b7ab1c9c854146af8af16f337a063d',uuid=43d83f29-ba12-4205-ba09-545c3dc28920,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.529 189070 DEBUG nova.network.os_vif_util [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Converting VIF {"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.530 189070 DEBUG nova.network.os_vif_util [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:c8:37,bridge_name='br-int',has_traffic_filtering=True,id=8977e440-f4bd-42c7-bf7b-c57e7184cc33,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8977e440-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.530 189070 DEBUG nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Updating guest XML with vif config: <interface type="ethernet">
Dec 05 09:27:38 compute-1 nova_compute[189066]:   <mac address="fa:16:3e:81:c8:37"/>
Dec 05 09:27:38 compute-1 nova_compute[189066]:   <model type="virtio"/>
Dec 05 09:27:38 compute-1 nova_compute[189066]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:27:38 compute-1 nova_compute[189066]:   <mtu size="1442"/>
Dec 05 09:27:38 compute-1 nova_compute[189066]:   <target dev="tap8977e440-f4"/>
Dec 05 09:27:38 compute-1 nova_compute[189066]: </interface>
Dec 05 09:27:38 compute-1 nova_compute[189066]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Dec 05 09:27:38 compute-1 nova_compute[189066]: 2025-12-05 09:27:38.531 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Dec 05 09:27:38 compute-1 podman[223879]: 2025-12-05 09:27:38.642800258 +0000 UTC m=+0.066733183 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 09:27:39 compute-1 nova_compute[189066]: 2025-12-05 09:27:39.011 189070 DEBUG nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 05 09:27:39 compute-1 nova_compute[189066]: 2025-12-05 09:27:39.012 189070 INFO nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 05 09:27:39 compute-1 nova_compute[189066]: 2025-12-05 09:27:39.022 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:39 compute-1 nova_compute[189066]: 2025-12-05 09:27:39.125 189070 INFO nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 05 09:27:39 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Dec 05 09:27:39 compute-1 systemd[223792]: Activating special unit Exit the Session...
Dec 05 09:27:39 compute-1 systemd[223792]: Stopped target Main User Target.
Dec 05 09:27:39 compute-1 systemd[223792]: Stopped target Basic System.
Dec 05 09:27:39 compute-1 systemd[223792]: Stopped target Paths.
Dec 05 09:27:39 compute-1 systemd[223792]: Stopped target Sockets.
Dec 05 09:27:39 compute-1 systemd[223792]: Stopped target Timers.
Dec 05 09:27:39 compute-1 systemd[223792]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 05 09:27:39 compute-1 systemd[223792]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 05 09:27:39 compute-1 systemd[223792]: Closed D-Bus User Message Bus Socket.
Dec 05 09:27:39 compute-1 systemd[223792]: Stopped Create User's Volatile Files and Directories.
Dec 05 09:27:39 compute-1 systemd[223792]: Removed slice User Application Slice.
Dec 05 09:27:39 compute-1 systemd[223792]: Reached target Shutdown.
Dec 05 09:27:39 compute-1 systemd[223792]: Finished Exit the Session.
Dec 05 09:27:39 compute-1 systemd[223792]: Reached target Exit the Session.
Dec 05 09:27:39 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Dec 05 09:27:39 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Dec 05 09:27:39 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec 05 09:27:39 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec 05 09:27:39 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec 05 09:27:39 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec 05 09:27:39 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Dec 05 09:27:39 compute-1 nova_compute[189066]: 2025-12-05 09:27:39.466 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:39 compute-1 nova_compute[189066]: 2025-12-05 09:27:39.629 189070 DEBUG nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 05 09:27:39 compute-1 nova_compute[189066]: 2025-12-05 09:27:39.630 189070 DEBUG nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec 05 09:27:40 compute-1 nova_compute[189066]: 2025-12-05 09:27:40.133 189070 DEBUG nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 05 09:27:40 compute-1 nova_compute[189066]: 2025-12-05 09:27:40.133 189070 DEBUG nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec 05 09:27:40 compute-1 nova_compute[189066]: 2025-12-05 09:27:40.637 189070 DEBUG nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec 05 09:27:40 compute-1 nova_compute[189066]: 2025-12-05 09:27:40.638 189070 DEBUG nova.virt.libvirt.migration [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec 05 09:27:40 compute-1 nova_compute[189066]: 2025-12-05 09:27:40.782 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926860.7822661, 43d83f29-ba12-4205-ba09-545c3dc28920 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:27:40 compute-1 nova_compute[189066]: 2025-12-05 09:27:40.783 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] VM Paused (Lifecycle Event)
Dec 05 09:27:40 compute-1 kernel: tap8977e440-f4 (unregistering): left promiscuous mode
Dec 05 09:27:41 compute-1 NetworkManager[55704]: <info>  [1764926861.0000] device (tap8977e440-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.019 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:41 compute-1 ovn_controller[95809]: 2025-12-05T09:27:41Z|00087|binding|INFO|Releasing lport 8977e440-f4bd-42c7-bf7b-c57e7184cc33 from this chassis (sb_readonly=0)
Dec 05 09:27:41 compute-1 ovn_controller[95809]: 2025-12-05T09:27:41Z|00088|binding|INFO|Setting lport 8977e440-f4bd-42c7-bf7b-c57e7184cc33 down in Southbound
Dec 05 09:27:41 compute-1 ovn_controller[95809]: 2025-12-05T09:27:41Z|00089|binding|INFO|Releasing lport d771a01e-fae5-4012-b36a-241c6c0bf739 from this chassis (sb_readonly=0)
Dec 05 09:27:41 compute-1 ovn_controller[95809]: 2025-12-05T09:27:41Z|00090|binding|INFO|Setting lport d771a01e-fae5-4012-b36a-241c6c0bf739 down in Southbound
Dec 05 09:27:41 compute-1 ovn_controller[95809]: 2025-12-05T09:27:41Z|00091|binding|INFO|Removing iface tap8977e440-f4 ovn-installed in OVS
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.024 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.071 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:41 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec 05 09:27:41 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000b.scope: Consumed 15.224s CPU time.
Dec 05 09:27:41 compute-1 systemd-machined[154815]: Machine qemu-6-instance-0000000b terminated.
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.244 189070 DEBUG nova.virt.libvirt.guest [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.244 189070 INFO nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Migration operation has completed
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.245 189070 INFO nova.compute.manager [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] _post_live_migration() is started..
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.258 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.258 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.259 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Dec 05 09:27:41 compute-1 podman[223926]: 2025-12-05 09:27:41.638104254 +0000 UTC m=+0.075987368 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.854 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.860 189070 DEBUG nova.compute.manager [req-680b9531-11f7-487e-bcc3-41caf2a7271e req-03023c61-cdee-4158-b02f-92514c1a1965 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-changed-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.860 189070 DEBUG nova.compute.manager [req-680b9531-11f7-487e-bcc3-41caf2a7271e req-03023c61-cdee-4158-b02f-92514c1a1965 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Refreshing instance network info cache due to event network-changed-8977e440-f4bd-42c7-bf7b-c57e7184cc33. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.860 189070 DEBUG oslo_concurrency.lockutils [req-680b9531-11f7-487e-bcc3-41caf2a7271e req-03023c61-cdee-4158-b02f-92514c1a1965 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.861 189070 DEBUG oslo_concurrency.lockutils [req-680b9531-11f7-487e-bcc3-41caf2a7271e req-03023c61-cdee-4158-b02f-92514c1a1965 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.861 189070 DEBUG nova.network.neutron [req-680b9531-11f7-487e-bcc3-41caf2a7271e req-03023c61-cdee-4158-b02f-92514c1a1965 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Refreshing network info cache for port 8977e440-f4bd-42c7-bf7b-c57e7184cc33 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:27:41 compute-1 ovn_controller[95809]: 2025-12-05T09:27:41Z|00092|binding|INFO|Releasing lport da7e4261-23bc-43e6-a0d1-1c34dd95f6c8 from this chassis (sb_readonly=0)
Dec 05 09:27:41 compute-1 ovn_controller[95809]: 2025-12-05T09:27:41Z|00093|binding|INFO|Releasing lport 0cfbe0fa-0fb7-480e-bf9d-3c9768dbeffa from this chassis (sb_readonly=0)
Dec 05 09:27:41 compute-1 ovn_controller[95809]: 2025-12-05T09:27:41Z|00094|binding|INFO|Releasing lport 6691f267-8fd6-45bc-bce0-2a0ee4f4c825 from this chassis (sb_readonly=0)
Dec 05 09:27:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:41.876 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:ce:ca 19.80.0.182'], port_security=['fa:16:3e:3f:ce:ca 19.80.0.182'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['8977e440-f4bd-42c7-bf7b-c57e7184cc33'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1788543283', 'neutron:cidrs': '19.80.0.182/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1788543283', 'neutron:project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'ab115429-13ca-4258-9a08-fc76d0ca17b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=868394f9-662d-4629-849a-2949c92470e2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d771a01e-fae5-4012-b36a-241c6c0bf739) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:27:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:41.878 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:c8:37 10.100.0.12'], port_security=['fa:16:3e:81:c8:37 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '27a42d69-fccd-4cb4-8b07-f904963c8b4f'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-840969275', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '43d83f29-ba12-4205-ba09-545c3dc28920', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-840969275', 'neutron:project_id': 'aa5b008731384d18ba83c8d69e76bcef', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ab115429-13ca-4258-9a08-fc76d0ca17b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=36325ae0-997b-4e15-a889-e33151da06b1, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=8977e440-f4bd-42c7-bf7b-c57e7184cc33) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:27:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:41.879 105272 INFO neutron.agent.ovn.metadata.agent [-] Port d771a01e-fae5-4012-b36a-241c6c0bf739 in datapath e5fa9b67-cb83-4b68-927e-0eb9577e2a4d unbound from our chassis
Dec 05 09:27:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:41.881 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e5fa9b67-cb83-4b68-927e-0eb9577e2a4d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:27:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:41.884 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d14b37-87ea-478b-898d-ee6ff4f0ae2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:41.884 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d namespace which is not needed anymore
Dec 05 09:27:41 compute-1 nova_compute[189066]: 2025-12-05 09:27:41.910 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d[223542]: [NOTICE]   (223556) : haproxy version is 2.8.14-c23fe91
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d[223542]: [NOTICE]   (223556) : path to executable is /usr/sbin/haproxy
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d[223542]: [WARNING]  (223556) : Exiting Master process...
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d[223542]: [ALERT]    (223556) : Current worker (223562) exited with code 143 (Terminated)
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d[223542]: [WARNING]  (223556) : All workers exited. Exiting... (0)
Dec 05 09:27:42 compute-1 systemd[1]: libpod-2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c.scope: Deactivated successfully.
Dec 05 09:27:42 compute-1 podman[223964]: 2025-12-05 09:27:42.041147867 +0000 UTC m=+0.049340878 container died 2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:27:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c-userdata-shm.mount: Deactivated successfully.
Dec 05 09:27:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-3152458a210273a36fba88993508d1329324094df5b774ac0a70e62a7e26f15c-merged.mount: Deactivated successfully.
Dec 05 09:27:42 compute-1 podman[223964]: 2025-12-05 09:27:42.07445346 +0000 UTC m=+0.082646481 container cleanup 2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:27:42 compute-1 systemd[1]: libpod-conmon-2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c.scope: Deactivated successfully.
Dec 05 09:27:42 compute-1 podman[223991]: 2025-12-05 09:27:42.14887484 +0000 UTC m=+0.047934084 container remove 2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.157 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5d61915b-e088-4144-9ee4-abdacc7a8d1f]: (4, ('Fri Dec  5 09:27:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d (2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c)\n2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c\nFri Dec  5 09:27:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d (2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c)\n2396afc30b4895da089705db463af93d561894176dc22dfffc8a2cfa8755b03c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.160 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[46ad3b8c-5698-44a6-8d10-4e1453d25a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.161 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5fa9b67-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:42 compute-1 nova_compute[189066]: 2025-12-05 09:27:42.165 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:42 compute-1 kernel: tape5fa9b67-c0: left promiscuous mode
Dec 05 09:27:42 compute-1 nova_compute[189066]: 2025-12-05 09:27:42.181 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.187 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4ec2ee-c6e2-40b5-b581-3347210e2725]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.200 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7d111bbc-27d8-481b-bc34-6b59a4d3a72f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.201 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fe873b90-3ed2-4413-b06f-3e9a20e14c1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.223 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fda39345-d28e-4f2f-af8c-1656aeb2a39b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410008, 'reachable_time': 37054, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224014, 'error': None, 'target': 'ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 systemd[1]: run-netns-ovnmeta\x2de5fa9b67\x2dcb83\x2d4b68\x2d927e\x2d0eb9577e2a4d.mount: Deactivated successfully.
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.228 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e5fa9b67-cb83-4b68-927e-0eb9577e2a4d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.228 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[49699ffc-4d7a-445b-a262-03f2980c788e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.230 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 8977e440-f4bd-42c7-bf7b-c57e7184cc33 in datapath 0a97aec7-0780-4b5e-9498-e796fd7b42fd unbound from our chassis
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.232 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a97aec7-0780-4b5e-9498-e796fd7b42fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.233 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[11e770d2-6dd3-4213-bf1c-5e24e6dca7ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.234 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd namespace which is not needed anymore
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[223636]: [NOTICE]   (223640) : haproxy version is 2.8.14-c23fe91
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[223636]: [NOTICE]   (223640) : path to executable is /usr/sbin/haproxy
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[223636]: [WARNING]  (223640) : Exiting Master process...
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[223636]: [ALERT]    (223640) : Current worker (223642) exited with code 143 (Terminated)
Dec 05 09:27:42 compute-1 neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd[223636]: [WARNING]  (223640) : All workers exited. Exiting... (0)
Dec 05 09:27:42 compute-1 systemd[1]: libpod-65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30.scope: Deactivated successfully.
Dec 05 09:27:42 compute-1 podman[224032]: 2025-12-05 09:27:42.401174017 +0000 UTC m=+0.054463392 container died 65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:27:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30-userdata-shm.mount: Deactivated successfully.
Dec 05 09:27:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-c1768a45cc4efc6b51693b0a548df8d56a3d8a2f65a04214aaf142688f31bd53-merged.mount: Deactivated successfully.
Dec 05 09:27:42 compute-1 podman[224032]: 2025-12-05 09:27:42.431410385 +0000 UTC m=+0.084699760 container cleanup 65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:27:42 compute-1 systemd[1]: libpod-conmon-65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30.scope: Deactivated successfully.
Dec 05 09:27:42 compute-1 podman[224059]: 2025-12-05 09:27:42.498869415 +0000 UTC m=+0.045057283 container remove 65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.505 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f39dca8b-6d4f-4029-850e-925397450c9f]: (4, ('Fri Dec  5 09:27:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd (65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30)\n65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30\nFri Dec  5 09:27:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd (65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30)\n65a4efaab1bb4c179caf23f9028f4b219d12683a280f7f7a945d46cfc9a9cf30\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.507 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[aa819c50-f601-47e8-8962-485a6d147d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.509 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a97aec7-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:42 compute-1 nova_compute[189066]: 2025-12-05 09:27:42.512 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:42 compute-1 kernel: tap0a97aec7-00: left promiscuous mode
Dec 05 09:27:42 compute-1 nova_compute[189066]: 2025-12-05 09:27:42.529 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.533 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[71de1ae2-a4ea-4727-a812-b7969a6a542a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.554 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2fe2fe19-d83b-4ba4-a695-9d65ea8064f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.556 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[726fa406-0e9b-47e8-8196-ea37848770af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.573 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[04c141f2-de5b-41b7-a7d4-4f381fbadb66]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410103, 'reachable_time': 33673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224079, 'error': None, 'target': 'ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.576 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a97aec7-0780-4b5e-9498-e796fd7b42fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:27:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:42.577 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[c16c2402-e326-4b95-9431-7e6929fd8377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:43 compute-1 systemd[1]: run-netns-ovnmeta\x2d0a97aec7\x2d0780\x2d4b5e\x2d9498\x2de796fd7b42fd.mount: Deactivated successfully.
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.329 189070 DEBUG oslo_concurrency.lockutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.329 189070 DEBUG oslo_concurrency.lockutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.330 189070 DEBUG nova.network.neutron [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.523 189070 DEBUG nova.network.neutron [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Activated binding for port 8977e440-f4bd-42c7-bf7b-c57e7184cc33 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.523 189070 DEBUG nova.compute.manager [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.524 189070 DEBUG nova.virt.libvirt.vif [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:26:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1534053809',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1534053809',id=11,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:27:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='aa5b008731384d18ba83c8d69e76bcef',ramdisk_id='',reservation_id='r-mjfc6v5g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-63392288',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-63392288-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:27:23Z,user_data=None,user_id='a9b7ab1c9c854146af8af16f337a063d',uuid=43d83f29-ba12-4205-ba09-545c3dc28920,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.525 189070 DEBUG nova.network.os_vif_util [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Converting VIF {"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.526 189070 DEBUG nova.network.os_vif_util [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:c8:37,bridge_name='br-int',has_traffic_filtering=True,id=8977e440-f4bd-42c7-bf7b-c57e7184cc33,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8977e440-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.526 189070 DEBUG os_vif [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:c8:37,bridge_name='br-int',has_traffic_filtering=True,id=8977e440-f4bd-42c7-bf7b-c57e7184cc33,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8977e440-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.529 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.530 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8977e440-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.532 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.534 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.538 189070 INFO os_vif [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:c8:37,bridge_name='br-int',has_traffic_filtering=True,id=8977e440-f4bd-42c7-bf7b-c57e7184cc33,network=Network(0a97aec7-0780-4b5e-9498-e796fd7b42fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8977e440-f4')
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.539 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.539 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.540 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.540 189070 DEBUG nova.compute.manager [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.541 189070 INFO nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Deleting instance files /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920_del
Dec 05 09:27:43 compute-1 nova_compute[189066]: 2025-12-05 09:27:43.542 189070 INFO nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Deletion of /var/lib/nova/instances/43d83f29-ba12-4205-ba09-545c3dc28920_del complete
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.061 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.103 189070 DEBUG nova.compute.manager [req-b718dde8-e842-4e54-aa2d-f14464d121b1 req-d371533f-7006-42c1-afae-1cd4e940c497 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.104 189070 DEBUG oslo_concurrency.lockutils [req-b718dde8-e842-4e54-aa2d-f14464d121b1 req-d371533f-7006-42c1-afae-1cd4e940c497 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.104 189070 DEBUG oslo_concurrency.lockutils [req-b718dde8-e842-4e54-aa2d-f14464d121b1 req-d371533f-7006-42c1-afae-1cd4e940c497 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.104 189070 DEBUG oslo_concurrency.lockutils [req-b718dde8-e842-4e54-aa2d-f14464d121b1 req-d371533f-7006-42c1-afae-1cd4e940c497 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.104 189070 DEBUG nova.compute.manager [req-b718dde8-e842-4e54-aa2d-f14464d121b1 req-d371533f-7006-42c1-afae-1cd4e940c497 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] No waiting events found dispatching network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.105 189070 WARNING nova.compute.manager [req-b718dde8-e842-4e54-aa2d-f14464d121b1 req-d371533f-7006-42c1-afae-1cd4e940c497 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received unexpected event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 for instance with vm_state active and task_state migrating.
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.133 189070 DEBUG nova.compute.manager [req-ec66e979-5381-4b83-b4e7-251f1dba19fc req-3218bfc8-87cd-40bc-ab13-29ca7b46f2e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-unplugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.133 189070 DEBUG oslo_concurrency.lockutils [req-ec66e979-5381-4b83-b4e7-251f1dba19fc req-3218bfc8-87cd-40bc-ab13-29ca7b46f2e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.134 189070 DEBUG oslo_concurrency.lockutils [req-ec66e979-5381-4b83-b4e7-251f1dba19fc req-3218bfc8-87cd-40bc-ab13-29ca7b46f2e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.134 189070 DEBUG oslo_concurrency.lockutils [req-ec66e979-5381-4b83-b4e7-251f1dba19fc req-3218bfc8-87cd-40bc-ab13-29ca7b46f2e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.134 189070 DEBUG nova.compute.manager [req-ec66e979-5381-4b83-b4e7-251f1dba19fc req-3218bfc8-87cd-40bc-ab13-29ca7b46f2e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] No waiting events found dispatching network-vif-unplugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.134 189070 DEBUG nova.compute.manager [req-ec66e979-5381-4b83-b4e7-251f1dba19fc req-3218bfc8-87cd-40bc-ab13-29ca7b46f2e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-unplugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.484 189070 DEBUG nova.network.neutron [req-680b9531-11f7-487e-bcc3-41caf2a7271e req-03023c61-cdee-4158-b02f-92514c1a1965 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Updated VIF entry in instance network info cache for port 8977e440-f4bd-42c7-bf7b-c57e7184cc33. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.485 189070 DEBUG nova.network.neutron [req-680b9531-11f7-487e-bcc3-41caf2a7271e req-03023c61-cdee-4158-b02f-92514c1a1965 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Updating instance_info_cache with network_info: [{"id": "8977e440-f4bd-42c7-bf7b-c57e7184cc33", "address": "fa:16:3e:81:c8:37", "network": {"id": "0a97aec7-0780-4b5e-9498-e796fd7b42fd", "bridge": null, "label": "tempest-LiveAutoBlockMigrationV225Test-971219995-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aa5b008731384d18ba83c8d69e76bcef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {"bound_drivers": {"0": "ovn"}}, "devname": "tap8977e440-f4", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:44 compute-1 nova_compute[189066]: 2025-12-05 09:27:44.518 189070 DEBUG oslo_concurrency.lockutils [req-680b9531-11f7-487e-bcc3-41caf2a7271e req-03023c61-cdee-4158-b02f-92514c1a1965 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-43d83f29-ba12-4205-ba09-545c3dc28920" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:45 compute-1 nova_compute[189066]: 2025-12-05 09:27:45.887 189070 DEBUG nova.network.neutron [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:45 compute-1 nova_compute[189066]: 2025-12-05 09:27:45.907 189070 DEBUG oslo_concurrency.lockutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.064 189070 DEBUG nova.virt.libvirt.driver [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.064 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Creating file /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/cf324f04891642cd8b39895f527256e8.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.065 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/cf324f04891642cd8b39895f527256e8.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.230 189070 DEBUG nova.compute.manager [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.231 189070 DEBUG oslo_concurrency.lockutils [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.231 189070 DEBUG oslo_concurrency.lockutils [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.231 189070 DEBUG oslo_concurrency.lockutils [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.232 189070 DEBUG nova.compute.manager [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] No waiting events found dispatching network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.232 189070 WARNING nova.compute.manager [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received unexpected event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 for instance with vm_state active and task_state migrating.
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.232 189070 DEBUG nova.compute.manager [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.232 189070 DEBUG oslo_concurrency.lockutils [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.233 189070 DEBUG oslo_concurrency.lockutils [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.233 189070 DEBUG oslo_concurrency.lockutils [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.233 189070 DEBUG nova.compute.manager [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] No waiting events found dispatching network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.233 189070 WARNING nova.compute.manager [req-7daac798-7c79-4b1e-a190-9bfe00d88905 req-82163267-90b8-496a-9910-02679ad52375 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Received unexpected event network-vif-plugged-8977e440-f4bd-42c7-bf7b-c57e7184cc33 for instance with vm_state active and task_state migrating.
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.511 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/cf324f04891642cd8b39895f527256e8.tmp" returned: 1 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.512 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/cf324f04891642cd8b39895f527256e8.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.513 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Creating directory /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.513 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:46 compute-1 sshd-session[224080]: Received disconnect from 122.114.113.177 port 33690:11: Bye Bye [preauth]
Dec 05 09:27:46 compute-1 sshd-session[224080]: Disconnected from authenticating user root 122.114.113.177 port 33690 [preauth]
Dec 05 09:27:46 compute-1 podman[224086]: 2025-12-05 09:27:46.641753995 +0000 UTC m=+0.063823922 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.718 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:46 compute-1 nova_compute[189066]: 2025-12-05 09:27:46.724 189070 DEBUG nova.virt.libvirt.driver [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 05 09:27:48 compute-1 sshd-session[224082]: Connection reset by authenticating user root 91.202.233.33 port 54634 [preauth]
Dec 05 09:27:48 compute-1 nova_compute[189066]: 2025-12-05 09:27:48.533 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:48 compute-1 kernel: tapfee19e88-d1 (unregistering): left promiscuous mode
Dec 05 09:27:48 compute-1 NetworkManager[55704]: <info>  [1764926868.9382] device (tapfee19e88-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:27:48 compute-1 ovn_controller[95809]: 2025-12-05T09:27:48Z|00095|binding|INFO|Releasing lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 from this chassis (sb_readonly=0)
Dec 05 09:27:48 compute-1 ovn_controller[95809]: 2025-12-05T09:27:48Z|00096|binding|INFO|Setting lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 down in Southbound
Dec 05 09:27:48 compute-1 ovn_controller[95809]: 2025-12-05T09:27:48Z|00097|binding|INFO|Removing iface tapfee19e88-d1 ovn-installed in OVS
Dec 05 09:27:48 compute-1 nova_compute[189066]: 2025-12-05 09:27:48.947 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:48 compute-1 nova_compute[189066]: 2025-12-05 09:27:48.988 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 05 09:27:49 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Consumed 15.327s CPU time.
Dec 05 09:27:49 compute-1 systemd-machined[154815]: Machine qemu-5-instance-0000000a terminated.
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.062 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 podman[224109]: 2025-12-05 09:27:49.06385275 +0000 UTC m=+0.082702383 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.221 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.226 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.301 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:12:e3 10.100.0.14'], port_security=['fa:16:3e:ad:12:e3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8cc6de91-de2a-4730-92b2-0be4bf62385a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512618ca-52a2-4c70-b76c-f25c0d00e2da, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=fee19e88-d18e-4020-97b6-26caf4ef6fa9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.302 105272 INFO neutron.agent.ovn.metadata.agent [-] Port fee19e88-d18e-4020-97b6-26caf4ef6fa9 in datapath 2d5ff262-0b2d-49fb-b643-980510ce97c7 unbound from our chassis
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.305 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2d5ff262-0b2d-49fb-b643-980510ce97c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.307 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe6fe51-c70d-4153-8557-5299a1c3c9d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.308 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 namespace which is not needed anymore
Dec 05 09:27:49 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[223468]: [NOTICE]   (223472) : haproxy version is 2.8.14-c23fe91
Dec 05 09:27:49 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[223468]: [NOTICE]   (223472) : path to executable is /usr/sbin/haproxy
Dec 05 09:27:49 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[223468]: [WARNING]  (223472) : Exiting Master process...
Dec 05 09:27:49 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[223468]: [ALERT]    (223472) : Current worker (223474) exited with code 143 (Terminated)
Dec 05 09:27:49 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[223468]: [WARNING]  (223472) : All workers exited. Exiting... (0)
Dec 05 09:27:49 compute-1 systemd[1]: libpod-23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5.scope: Deactivated successfully.
Dec 05 09:27:49 compute-1 podman[224174]: 2025-12-05 09:27:49.473315629 +0000 UTC m=+0.056940123 container died 23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:27:49 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5-userdata-shm.mount: Deactivated successfully.
Dec 05 09:27:49 compute-1 systemd[1]: var-lib-containers-storage-overlay-1feb21e2e4d4c74cc9d649a43f86e416f91d98315e1476ee1fdc2a3f7dfd70d5-merged.mount: Deactivated successfully.
Dec 05 09:27:49 compute-1 podman[224174]: 2025-12-05 09:27:49.508246043 +0000 UTC m=+0.091870517 container cleanup 23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 09:27:49 compute-1 systemd[1]: libpod-conmon-23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5.scope: Deactivated successfully.
Dec 05 09:27:49 compute-1 podman[224203]: 2025-12-05 09:27:49.57319337 +0000 UTC m=+0.042142071 container remove 23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.579 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c2ed52-20ef-4461-a92d-308684fca38a]: (4, ('Fri Dec  5 09:27:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 (23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5)\n23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5\nFri Dec  5 09:27:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 (23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5)\n23ba8fc634738c5b2fd79cd10d3697ceba69f51d33ed176194dee31a9a898da5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.581 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf73aef-3f67-4175-b720-a8dd71d549cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.582 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d5ff262-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.585 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 kernel: tap2d5ff262-00: left promiscuous mode
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.607 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.608 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.610 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e79d15b3-dbdb-466b-8eaf-c801927fbf4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.627 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7fcf8cba-cb2b-4807-aca6-505cbac4461c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.628 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[1d77f0bd-530d-4613-b224-3b7a0a80b4d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.647 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e0441c-4e53-4abc-922f-53830f9952ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409914, 'reachable_time': 22928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224221, 'error': None, 'target': 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:49 compute-1 systemd[1]: run-netns-ovnmeta\x2d2d5ff262\x2d0b2d\x2d49fb\x2db643\x2d980510ce97c7.mount: Deactivated successfully.
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.651 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:27:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:27:49.651 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[96e8f3a9-77a2-43a3-bf38-0fc3e12ecd7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.745 189070 INFO nova.virt.libvirt.driver [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance shutdown successfully after 3 seconds.
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.752 189070 INFO nova.virt.libvirt.driver [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance destroyed successfully.
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.753 189070 DEBUG nova.virt.libvirt.vif [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:26:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1301659118',display_name='tempest-TestNetworkAdvancedServerOps-server-1301659118',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1301659118',id=10,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/IH5546c3h1NqRaMnG8oeFTcOTykF/TM0HkVvHYHh6GuyCm0ZuW+dN3e3eFJ7fYYhI257JWo40Mrp34xaUZfOzZGF5l/jYsmwg8LZO1T3z+5cPaeu5EMQJR4Fd4T/olg==',key_name='tempest-TestNetworkAdvancedServerOps-1059800776',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:27:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-i1fw0oxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:27:42Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=9d2b0f76-0408-4f6b-8a37-ac9882a44b4e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2096539509", "vif_mac": "fa:16:3e:ad:12:e3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.754 189070 DEBUG nova.network.os_vif_util [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Converting VIF {"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2096539509", "vif_mac": "fa:16:3e:ad:12:e3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.755 189070 DEBUG nova.network.os_vif_util [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.755 189070 DEBUG os_vif [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.758 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.759 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfee19e88-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.761 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.762 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.765 189070 INFO os_vif [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1')
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.769 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.843 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.845 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.903 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.905 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Copying file /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk to 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 05 09:27:49 compute-1 nova_compute[189066]: 2025-12-05 09:27:49.905 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:50 compute-1 nova_compute[189066]: 2025-12-05 09:27:50.353 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "scp -r /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:50 compute-1 nova_compute[189066]: 2025-12-05 09:27:50.354 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Copying file /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 05 09:27:50 compute-1 nova_compute[189066]: 2025-12-05 09:27:50.355 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk.config 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:50 compute-1 sshd-session[224108]: Invalid user user from 91.202.233.33 port 54650
Dec 05 09:27:50 compute-1 nova_compute[189066]: 2025-12-05 09:27:50.585 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "scp -C -r /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk.config 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config" returned: 0 in 0.230s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:50 compute-1 nova_compute[189066]: 2025-12-05 09:27:50.586 189070 DEBUG nova.virt.libvirt.volume.remotefs [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Copying file /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 05 09:27:50 compute-1 nova_compute[189066]: 2025-12-05 09:27:50.587 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk.info 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:27:50 compute-1 nova_compute[189066]: 2025-12-05 09:27:50.792 189070 DEBUG oslo_concurrency.processutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] CMD "scp -C -r /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_resize/disk.info 192.168.122.102:/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:27:50 compute-1 sshd-session[224108]: Connection reset by invalid user user 91.202.233.33 port 54650 [preauth]
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.862 189070 DEBUG nova.compute.manager [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.862 189070 DEBUG oslo_concurrency.lockutils [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.862 189070 DEBUG oslo_concurrency.lockutils [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.863 189070 DEBUG oslo_concurrency.lockutils [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.863 189070 DEBUG nova.compute.manager [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.863 189070 WARNING nova.compute.manager [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state active and task_state resize_migrating.
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.863 189070 DEBUG nova.compute.manager [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.863 189070 DEBUG oslo_concurrency.lockutils [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.864 189070 DEBUG oslo_concurrency.lockutils [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.864 189070 DEBUG oslo_concurrency.lockutils [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.864 189070 DEBUG nova.compute.manager [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.864 189070 WARNING nova.compute.manager [req-0befec5e-6f83-43f3-b3f9-17b9a1b847d5 req-d7dca8b7-ebb1-4ea1-b2e2-3178c6106322 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state active and task_state resize_migrating.
Dec 05 09:27:51 compute-1 nova_compute[189066]: 2025-12-05 09:27:51.972 189070 DEBUG neutronclient.v2_0.client [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fee19e88-d18e-4020-97b6-26caf4ef6fa9 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.093 189070 DEBUG oslo_concurrency.lockutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.094 189070 DEBUG oslo_concurrency.lockutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.094 189070 DEBUG oslo_concurrency.lockutils [None req-c762e13d-2bb2-475a-8d1b-53e9a4ff209c 401ddf2632c64d0094aff64bba42db19 ea153810a6ed4283be879b89a75f1e86 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.166 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.167 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.167 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "43d83f29-ba12-4205-ba09-545c3dc28920-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.203 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.204 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.204 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.205 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.293 189070 WARNING nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Periodic task is updating the host stats, it is trying to get disk info for instance-0000000a, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.464 189070 WARNING nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.465 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5740MB free_disk=73.3051528930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": 
"0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.465 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.466 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.520 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Migration for instance 43d83f29-ba12-4205-ba09-545c3dc28920 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.521 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Migration for instance 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.548 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.588 189070 INFO nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating resource usage from migration ef60dbc9-7d05-4b2b-b77b-7867a7770c46
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.589 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Starting to track outgoing migration ef60dbc9-7d05-4b2b-b77b-7867a7770c46 with flavor fbadeab4-f24f-4100-963a-d228b2a6f7c4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.632 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Migration f230ce5b-ec62-4081-8078-d62bbbc1ecbd is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.632 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Migration ef60dbc9-7d05-4b2b-b77b-7867a7770c46 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.633 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.633 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.722 189070 DEBUG nova.compute.provider_tree [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.742 189070 DEBUG nova.scheduler.client.report [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.786 189070 DEBUG nova.compute.resource_tracker [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.787 189070 DEBUG oslo_concurrency.lockutils [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.793 189070 INFO nova.compute.manager [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.942 189070 INFO nova.scheduler.client.report [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] Deleted allocation for migration f230ce5b-ec62-4081-8078-d62bbbc1ecbd
Dec 05 09:27:52 compute-1 nova_compute[189066]: 2025-12-05 09:27:52.943 189070 DEBUG nova.virt.libvirt.driver [None req-5c023d50-6bf4-4c07-8ad5-79325ee30727 9d57c2b5ca2f41c09c8f4b723716ef7b 9891534839274cbc89aec062c36488b0 - - default default] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Dec 05 09:27:54 compute-1 sshd-session[224234]: Connection reset by authenticating user root 91.202.233.33 port 26832 [preauth]
Dec 05 09:27:54 compute-1 nova_compute[189066]: 2025-12-05 09:27:54.099 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:54 compute-1 nova_compute[189066]: 2025-12-05 09:27:54.607 189070 DEBUG nova.compute.manager [req-4f785a70-029c-4937-a096-933805101d0a req-52c32e6d-c9f9-469b-92d3-376fd3af3c33 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:27:54 compute-1 nova_compute[189066]: 2025-12-05 09:27:54.607 189070 DEBUG nova.compute.manager [req-4f785a70-029c-4937-a096-933805101d0a req-52c32e6d-c9f9-469b-92d3-376fd3af3c33 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing instance network info cache due to event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:27:54 compute-1 nova_compute[189066]: 2025-12-05 09:27:54.608 189070 DEBUG oslo_concurrency.lockutils [req-4f785a70-029c-4937-a096-933805101d0a req-52c32e6d-c9f9-469b-92d3-376fd3af3c33 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:27:54 compute-1 nova_compute[189066]: 2025-12-05 09:27:54.608 189070 DEBUG oslo_concurrency.lockutils [req-4f785a70-029c-4937-a096-933805101d0a req-52c32e6d-c9f9-469b-92d3-376fd3af3c33 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:27:54 compute-1 nova_compute[189066]: 2025-12-05 09:27:54.608 189070 DEBUG nova.network.neutron [req-4f785a70-029c-4937-a096-933805101d0a req-52c32e6d-c9f9-469b-92d3-376fd3af3c33 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:27:54 compute-1 nova_compute[189066]: 2025-12-05 09:27:54.762 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:55 compute-1 podman[224239]: 2025-12-05 09:27:55.652769938 +0000 UTC m=+0.080915457 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:27:56 compute-1 nova_compute[189066]: 2025-12-05 09:27:56.858 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764926861.2430432, 43d83f29-ba12-4205-ba09-545c3dc28920 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:27:56 compute-1 nova_compute[189066]: 2025-12-05 09:27:56.859 189070 INFO nova.compute.manager [-] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] VM Stopped (Lifecycle Event)
Dec 05 09:27:56 compute-1 nova_compute[189066]: 2025-12-05 09:27:56.884 189070 DEBUG nova.compute.manager [None req-318d80da-f972-4e86-803c-c9900c7d9ce4 - - - - - -] [instance: 43d83f29-ba12-4205-ba09-545c3dc28920] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:27:56 compute-1 sshd-session[224237]: Connection reset by authenticating user root 91.202.233.33 port 26868 [preauth]
Dec 05 09:27:58 compute-1 nova_compute[189066]: 2025-12-05 09:27:58.590 189070 DEBUG nova.network.neutron [req-4f785a70-029c-4937-a096-933805101d0a req-52c32e6d-c9f9-469b-92d3-376fd3af3c33 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updated VIF entry in instance network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:27:58 compute-1 nova_compute[189066]: 2025-12-05 09:27:58.591 189070 DEBUG nova.network.neutron [req-4f785a70-029c-4937-a096-933805101d0a req-52c32e6d-c9f9-469b-92d3-376fd3af3c33 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:27:58 compute-1 nova_compute[189066]: 2025-12-05 09:27:58.621 189070 DEBUG oslo_concurrency.lockutils [req-4f785a70-029c-4937-a096-933805101d0a req-52c32e6d-c9f9-469b-92d3-376fd3af3c33 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:27:58 compute-1 sshd-session[224265]: Connection reset by authenticating user root 91.202.233.33 port 26876 [preauth]
Dec 05 09:27:59 compute-1 nova_compute[189066]: 2025-12-05 09:27:59.100 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:27:59 compute-1 nova_compute[189066]: 2025-12-05 09:27:59.763 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:01 compute-1 nova_compute[189066]: 2025-12-05 09:28:01.941 189070 DEBUG nova.compute.manager [req-fdeecf0d-bc07-4a6e-8357-08f20e3f3326 req-9f6eccc3-113c-4cba-9ffa-570d0a680bca 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:01 compute-1 nova_compute[189066]: 2025-12-05 09:28:01.941 189070 DEBUG oslo_concurrency.lockutils [req-fdeecf0d-bc07-4a6e-8357-08f20e3f3326 req-9f6eccc3-113c-4cba-9ffa-570d0a680bca 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:01 compute-1 nova_compute[189066]: 2025-12-05 09:28:01.942 189070 DEBUG oslo_concurrency.lockutils [req-fdeecf0d-bc07-4a6e-8357-08f20e3f3326 req-9f6eccc3-113c-4cba-9ffa-570d0a680bca 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:01 compute-1 nova_compute[189066]: 2025-12-05 09:28:01.942 189070 DEBUG oslo_concurrency.lockutils [req-fdeecf0d-bc07-4a6e-8357-08f20e3f3326 req-9f6eccc3-113c-4cba-9ffa-570d0a680bca 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:01 compute-1 nova_compute[189066]: 2025-12-05 09:28:01.942 189070 DEBUG nova.compute.manager [req-fdeecf0d-bc07-4a6e-8357-08f20e3f3326 req-9f6eccc3-113c-4cba-9ffa-570d0a680bca 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:01 compute-1 nova_compute[189066]: 2025-12-05 09:28:01.942 189070 WARNING nova.compute.manager [req-fdeecf0d-bc07-4a6e-8357-08f20e3f3326 req-9f6eccc3-113c-4cba-9ffa-570d0a680bca 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state resized and task_state None.
Dec 05 09:28:03 compute-1 podman[224269]: 2025-12-05 09:28:03.633940721 +0000 UTC m=+0.076874140 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 05 09:28:03 compute-1 sshd-session[224267]: Received disconnect from 101.47.162.91 port 50358:11: Bye Bye [preauth]
Dec 05 09:28:03 compute-1 sshd-session[224267]: Disconnected from authenticating user root 101.47.162.91 port 50358 [preauth]
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.103 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.267 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764926869.2661347, 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.268 189070 INFO nova.compute.manager [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] VM Stopped (Lifecycle Event)
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.294 189070 DEBUG nova.compute.manager [None req-f846014d-7657-40ff-9f85-b0f41a5bc9e5 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.299 189070 DEBUG nova.compute.manager [None req-f846014d-7657-40ff-9f85-b0f41a5bc9e5 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.327 189070 INFO nova.compute.manager [None req-f846014d-7657-40ff-9f85-b0f41a5bc9e5 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.626 189070 DEBUG nova.compute.manager [req-c2ee262e-9fdf-49da-9519-21b6d733f626 req-c938ea33-a438-4121-9e89-c823326a8b79 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.626 189070 DEBUG oslo_concurrency.lockutils [req-c2ee262e-9fdf-49da-9519-21b6d733f626 req-c938ea33-a438-4121-9e89-c823326a8b79 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.627 189070 DEBUG oslo_concurrency.lockutils [req-c2ee262e-9fdf-49da-9519-21b6d733f626 req-c938ea33-a438-4121-9e89-c823326a8b79 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.627 189070 DEBUG oslo_concurrency.lockutils [req-c2ee262e-9fdf-49da-9519-21b6d733f626 req-c938ea33-a438-4121-9e89-c823326a8b79 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.627 189070 DEBUG nova.compute.manager [req-c2ee262e-9fdf-49da-9519-21b6d733f626 req-c938ea33-a438-4121-9e89-c823326a8b79 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.627 189070 WARNING nova.compute.manager [req-c2ee262e-9fdf-49da-9519-21b6d733f626 req-c938ea33-a438-4121-9e89-c823326a8b79 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state resized and task_state resize_reverting.
Dec 05 09:28:04 compute-1 nova_compute[189066]: 2025-12-05 09:28:04.765 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.555 189070 INFO nova.compute.manager [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Swapping old allocation on dict_keys(['be68f9f1-7820-4bfa-8dbd-210e13729f64']) held by migration ef60dbc9-7d05-4b2b-b77b-7867a7770c46 for instance
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.604 189070 DEBUG nova.scheduler.client.report [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Overwriting current allocation {'allocations': {'c9dc6886-5698-47d7-9fba-c02789cc3433': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 12}}, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'user_id': '65751a90715341b2984ef84ebbaa1650', 'consumer_generation': 1} on consumer 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.905 189070 DEBUG nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.906 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.906 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.906 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.906 189070 DEBUG nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.907 189070 WARNING nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state resized and task_state resize_reverting.
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.907 189070 DEBUG nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.907 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.907 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.907 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.908 189070 DEBUG nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.908 189070 WARNING nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state resized and task_state resize_reverting.
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.908 189070 DEBUG nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.908 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.908 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.909 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.909 189070 DEBUG nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.909 189070 WARNING nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state resized and task_state resize_reverting.
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.909 189070 DEBUG nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.910 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.910 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.910 189070 DEBUG oslo_concurrency.lockutils [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.910 189070 DEBUG nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:06 compute-1 nova_compute[189066]: 2025-12-05 09:28:06.911 189070 WARNING nova.compute.manager [req-b9fc332b-9196-469e-84ec-93f6eb49dd96 req-2885db56-e974-4def-a15f-267f3b07837c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state resized and task_state resize_reverting.
Dec 05 09:28:07 compute-1 nova_compute[189066]: 2025-12-05 09:28:07.063 189070 INFO nova.network.neutron [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating port fee19e88-d18e-4020-97b6-26caf4ef6fa9 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Dec 05 09:28:07 compute-1 podman[224289]: 2025-12-05 09:28:07.876754873 +0000 UTC m=+0.313809711 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.306 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.306 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.426 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.629 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.630 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.703 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.704 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.711 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.711 189070 INFO nova.compute.claims [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:28:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:08.872 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:08.873 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:08.874 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:08 compute-1 nova_compute[189066]: 2025-12-05 09:28:08.940 189070 DEBUG nova.compute.provider_tree [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.048 189070 DEBUG nova.scheduler.client.report [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.085 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.087 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.142 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.159 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.159 189070 DEBUG nova.network.neutron [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.196 189070 INFO nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.222 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.245 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.246 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.246 189070 DEBUG nova.network.neutron [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.523 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.525 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.526 189070 INFO nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Creating image(s)
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.527 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "/var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.527 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "/var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.528 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "/var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.551 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:28:09 compute-1 podman[224315]: 2025-12-05 09:28:09.614746757 +0000 UTC m=+0.053199402 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.622 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.623 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.624 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.636 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.678 189070 DEBUG nova.compute.manager [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.679 189070 DEBUG oslo_concurrency.lockutils [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.679 189070 DEBUG oslo_concurrency.lockutils [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.680 189070 DEBUG oslo_concurrency.lockutils [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.680 189070 DEBUG nova.compute.manager [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.680 189070 WARNING nova.compute.manager [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state resized and task_state resize_reverting.
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.680 189070 DEBUG nova.compute.manager [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.681 189070 DEBUG oslo_concurrency.lockutils [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.681 189070 DEBUG oslo_concurrency.lockutils [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.681 189070 DEBUG oslo_concurrency.lockutils [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.681 189070 DEBUG nova.compute.manager [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.682 189070 WARNING nova.compute.manager [req-a1fae37d-4814-4f94-81d8-a6bff03dd8cd req-1348b13a-eb4e-4c15-8c9c-f1a74625f12c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state resized and task_state resize_reverting.
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.700 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.725 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.765 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.766 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.766 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.787 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.829 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.830 189070 DEBUG nova.virt.disk.api [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Checking if we can resize image /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.831 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.860 189070 DEBUG nova.policy [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.892 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.893 189070 DEBUG nova.virt.disk.api [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Cannot resize image /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:28:09 compute-1 nova_compute[189066]: 2025-12-05 09:28:09.894 189070 DEBUG nova.objects.instance [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lazy-loading 'migration_context' on Instance uuid 2bdfdb89-21af-43b6-93eb-48a637bfbd4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.288 189070 DEBUG nova.compute.manager [req-a03f6cf6-0751-4400-aee9-d14b275cca40 req-3fbfedf7-7bcb-458f-94f6-c804b96bbd38 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.288 189070 DEBUG nova.compute.manager [req-a03f6cf6-0751-4400-aee9-d14b275cca40 req-3fbfedf7-7bcb-458f-94f6-c804b96bbd38 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing instance network info cache due to event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.288 189070 DEBUG oslo_concurrency.lockutils [req-a03f6cf6-0751-4400-aee9-d14b275cca40 req-3fbfedf7-7bcb-458f-94f6-c804b96bbd38 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.294 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.294 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Ensure instance console log exists: /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.294 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.295 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:10 compute-1 nova_compute[189066]: 2025-12-05 09:28:10.295 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:12 compute-1 podman[224347]: 2025-12-05 09:28:12.649768206 +0000 UTC m=+0.086276800 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 09:28:13 compute-1 nova_compute[189066]: 2025-12-05 09:28:13.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:13 compute-1 nova_compute[189066]: 2025-12-05 09:28:13.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:13 compute-1 nova_compute[189066]: 2025-12-05 09:28:13.633 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:13 compute-1 nova_compute[189066]: 2025-12-05 09:28:13.634 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:13 compute-1 nova_compute[189066]: 2025-12-05 09:28:13.634 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:13 compute-1 nova_compute[189066]: 2025-12-05 09:28:13.634 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.063 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.063 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.064 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.106 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000000a, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.188 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.281 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.282 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5765MB free_disk=73.30497360229492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.282 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.282 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.307 189070 DEBUG nova.network.neutron [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.380 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.381 189070 DEBUG nova.virt.libvirt.driver [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.383 189070 DEBUG oslo_concurrency.lockutils [req-a03f6cf6-0751-4400-aee9-d14b275cca40 req-3fbfedf7-7bcb-458f-94f6-c804b96bbd38 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.384 189070 DEBUG nova.network.neutron [req-a03f6cf6-0751-4400-aee9-d14b275cca40 req-3fbfedf7-7bcb-458f-94f6-c804b96bbd38 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.389 189070 DEBUG nova.virt.libvirt.driver [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Start _get_guest_xml network_info=[{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.392 189070 WARNING nova.virt.libvirt.driver [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.402 189070 DEBUG nova.virt.libvirt.host [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.403 189070 DEBUG nova.virt.libvirt.host [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.407 189070 DEBUG nova.virt.libvirt.host [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.407 189070 DEBUG nova.virt.libvirt.host [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.408 189070 DEBUG nova.virt.libvirt.driver [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.409 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.409 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.409 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.410 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.410 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.410 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.410 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.410 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.411 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.412 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.412 189070 DEBUG nova.virt.hardware [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.412 189070 DEBUG nova.objects.instance [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.437 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.437 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 2bdfdb89-21af-43b6-93eb-48a637bfbd4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.438 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.439 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.486 189070 DEBUG oslo_concurrency.processutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.546 189070 DEBUG oslo_concurrency.processutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.547 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.548 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.549 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.550 189070 DEBUG nova.virt.libvirt.vif [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T09:26:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1301659118',display_name='tempest-TestNetworkAdvancedServerOps-server-1301659118',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1301659118',id=10,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/IH5546c3h1NqRaMnG8oeFTcOTykF/TM0HkVvHYHh6GuyCm0ZuW+dN3e3eFJ7fYYhI257JWo40Mrp34xaUZfOzZGF5l/jYsmwg8LZO1T3z+5cPaeu5EMQJR4Fd4T/olg==',key_name='tempest-TestNetworkAdvancedServerOps-1059800776',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:27:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-i1fw0oxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:28:06Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=9d2b0f76-0408-4f6b-8a37-ac9882a44b4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.551 189070 DEBUG nova.network.os_vif_util [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.552 189070 DEBUG nova.network.os_vif_util [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.555 189070 DEBUG nova.virt.libvirt.driver [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <uuid>9d2b0f76-0408-4f6b-8a37-ac9882a44b4e</uuid>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <name>instance-0000000a</name>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1301659118</nova:name>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:28:14</nova:creationTime>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:28:14 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:28:14 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:28:14 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:28:14 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:28:14 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:28:14 compute-1 nova_compute[189066]:         <nova:user uuid="65751a90715341b2984ef84ebbaa1650">tempest-TestNetworkAdvancedServerOps-1829130727-project-member</nova:user>
Dec 05 09:28:14 compute-1 nova_compute[189066]:         <nova:project uuid="e26ae3fdd48d4947978a480f70e14f84">tempest-TestNetworkAdvancedServerOps-1829130727</nova:project>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:28:14 compute-1 nova_compute[189066]:         <nova:port uuid="fee19e88-d18e-4020-97b6-26caf4ef6fa9">
Dec 05 09:28:14 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <system>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <entry name="serial">9d2b0f76-0408-4f6b-8a37-ac9882a44b4e</entry>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <entry name="uuid">9d2b0f76-0408-4f6b-8a37-ac9882a44b4e</entry>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </system>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <os>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   </os>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <features>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   </features>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/disk.config"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:ad:12:e3"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <target dev="tapfee19e88-d1"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e/console.log" append="off"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <video>
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </video>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <input type="keyboard" bus="usb"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:28:14 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:28:14 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:28:14 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:28:14 compute-1 nova_compute[189066]: </domain>
Dec 05 09:28:14 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.556 189070 DEBUG nova.compute.manager [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Preparing to wait for external event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.557 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.557 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.557 189070 DEBUG oslo_concurrency.lockutils [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.558 189070 DEBUG nova.virt.libvirt.vif [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T09:26:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1301659118',display_name='tempest-TestNetworkAdvancedServerOps-server-1301659118',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1301659118',id=10,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/IH5546c3h1NqRaMnG8oeFTcOTykF/TM0HkVvHYHh6GuyCm0ZuW+dN3e3eFJ7fYYhI257JWo40Mrp34xaUZfOzZGF5l/jYsmwg8LZO1T3z+5cPaeu5EMQJR4Fd4T/olg==',key_name='tempest-TestNetworkAdvancedServerOps-1059800776',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:27:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-i1fw0oxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:28:06Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=9d2b0f76-0408-4f6b-8a37-ac9882a44b4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.558 189070 DEBUG nova.network.os_vif_util [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.559 189070 DEBUG nova.network.os_vif_util [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.559 189070 DEBUG os_vif [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.563 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.563 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.564 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.570 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.571 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfee19e88-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.572 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfee19e88-d1, col_values=(('external_ids', {'iface-id': 'fee19e88-d18e-4020-97b6-26caf4ef6fa9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:12:e3', 'vm-uuid': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.575 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 NetworkManager[55704]: <info>  [1764926894.5771] manager: (tapfee19e88-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.579 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.582 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.582 189070 INFO os_vif [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1')
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.592 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.616 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:28:14 compute-1 kernel: tapfee19e88-d1: entered promiscuous mode
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.653 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:28:14 compute-1 NetworkManager[55704]: <info>  [1764926894.6544] manager: (tapfee19e88-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.654 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.655 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 ovn_controller[95809]: 2025-12-05T09:28:14Z|00098|binding|INFO|Claiming lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 for this chassis.
Dec 05 09:28:14 compute-1 ovn_controller[95809]: 2025-12-05T09:28:14Z|00099|binding|INFO|fee19e88-d18e-4020-97b6-26caf4ef6fa9: Claiming fa:16:3e:ad:12:e3 10.100.0.14
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.657 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.667 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:12:e3 10.100.0.14'], port_security=['fa:16:3e:ad:12:e3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '12', 'neutron:security_group_ids': '8cc6de91-de2a-4730-92b2-0be4bf62385a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512618ca-52a2-4c70-b76c-f25c0d00e2da, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=fee19e88-d18e-4020-97b6-26caf4ef6fa9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.668 105272 INFO neutron.agent.ovn.metadata.agent [-] Port fee19e88-d18e-4020-97b6-26caf4ef6fa9 in datapath 2d5ff262-0b2d-49fb-b643-980510ce97c7 bound to our chassis
Dec 05 09:28:14 compute-1 ovn_controller[95809]: 2025-12-05T09:28:14Z|00100|binding|INFO|Setting lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 ovn-installed in OVS
Dec 05 09:28:14 compute-1 ovn_controller[95809]: 2025-12-05T09:28:14Z|00101|binding|INFO|Setting lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 up in Southbound
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.670 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2d5ff262-0b2d-49fb-b643-980510ce97c7
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.670 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.685 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[68558f0e-d043-45d4-a69d-4d4d71918924]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.686 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2d5ff262-01 in ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.688 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2d5ff262-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.688 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f0baa329-9f6a-420c-af17-1db8021303eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.689 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9353a2a6-9d2f-4190-8e9f-8261f366d1ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 systemd-udevd[224389]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:28:14 compute-1 systemd-machined[154815]: New machine qemu-7-instance-0000000a.
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.701 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[786a1aea-02cf-42ea-b1dc-d8e5081b1e49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 NetworkManager[55704]: <info>  [1764926894.7071] device (tapfee19e88-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:28:14 compute-1 NetworkManager[55704]: <info>  [1764926894.7082] device (tapfee19e88-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:28:14 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-0000000a.
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.714 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[bcadd152-938d-4359-a856-3fc0e604d0f9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.758 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[06b40d5c-7400-4bf9-9fe0-9929e3114e4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 NetworkManager[55704]: <info>  [1764926894.7675] manager: (tap2d5ff262-00): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.766 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e898f47c-bc44-4fef-8b5e-9d45f9f57d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.811 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[0ee31191-3043-4ae1-80f3-ead1d9bd2a98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.815 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[84e2da4d-d916-4c82-982c-36e00655e7a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 NetworkManager[55704]: <info>  [1764926894.8408] device (tap2d5ff262-00): carrier: link connected
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.850 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[c77a3a01-a5a3-47fd-83c1-87e5fcb23002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.882 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a0018d14-0cee-4411-b09f-3d0e4edca456]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2d5ff262-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:1f:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416784, 'reachable_time': 16928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224421, 'error': None, 'target': 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.903 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4f88635e-f331-4aed-b6b9-a0e37cffcd98]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:1fea'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416784, 'tstamp': 416784}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224422, 'error': None, 'target': 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 nova_compute[189066]: 2025-12-05 09:28:14.925 189070 DEBUG nova.network.neutron [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Successfully created port: 6cdc578a-4ede-49b9-83a0-819716269b48 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.925 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[bf6e6740-e57a-43a9-8d05-c70cd89d0576]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2d5ff262-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:1f:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416784, 'reachable_time': 16928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224423, 'error': None, 'target': 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:14.965 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[878023f2-62fc-4f99-8abf-36609296da75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.041 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[332c3758-606d-44a7-888b-39db96374b73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.043 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d5ff262-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.043 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.044 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d5ff262-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:15 compute-1 NetworkManager[55704]: <info>  [1764926895.0470] manager: (tap2d5ff262-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec 05 09:28:15 compute-1 kernel: tap2d5ff262-00: entered promiscuous mode
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.051 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2d5ff262-00, col_values=(('external_ids', {'iface-id': 'da7e4261-23bc-43e6-a0d1-1c34dd95f6c8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.051 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:15 compute-1 ovn_controller[95809]: 2025-12-05T09:28:15Z|00102|binding|INFO|Releasing lport da7e4261-23bc-43e6-a0d1-1c34dd95f6c8 from this chassis (sb_readonly=0)
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.054 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2d5ff262-0b2d-49fb-b643-980510ce97c7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2d5ff262-0b2d-49fb-b643-980510ce97c7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.055 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2be63b-078c-41ca-986f-dbb01e51322d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.056 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-2d5ff262-0b2d-49fb-b643-980510ce97c7
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/2d5ff262-0b2d-49fb-b643-980510ce97c7.pid.haproxy
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 2d5ff262-0b2d-49fb-b643-980510ce97c7
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:28:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:15.058 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'env', 'PROCESS_TAG=haproxy-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2d5ff262-0b2d-49fb-b643-980510ce97c7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:28:15 compute-1 podman[224455]: 2025-12-05 09:28:15.486713002 +0000 UTC m=+0.062140560 container create fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:28:15 compute-1 systemd[1]: Started libpod-conmon-fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e.scope.
Dec 05 09:28:15 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:28:15 compute-1 podman[224455]: 2025-12-05 09:28:15.453697524 +0000 UTC m=+0.029125112 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:28:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a98e71583e7427a1c8bbbe932a85ece63cbd2d0e4f4e53f87b0b0ece8df447ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:28:15 compute-1 podman[224455]: 2025-12-05 09:28:15.562479774 +0000 UTC m=+0.137907362 container init fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:28:15 compute-1 podman[224455]: 2025-12-05 09:28:15.570705345 +0000 UTC m=+0.146132903 container start fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:28:15 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[224476]: [NOTICE]   (224481) : New worker (224484) forked
Dec 05 09:28:15 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[224476]: [NOTICE]   (224481) : Loading success.
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.638 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926895.637178, 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.639 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] VM Started (Lifecycle Event)
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.654 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.655 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.655 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.668 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.673 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926895.6382093, 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.673 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] VM Paused (Lifecycle Event)
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.683 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.683 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.705 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.709 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:28:15 compute-1 nova_compute[189066]: 2025-12-05 09:28:15.742 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.802 189070 DEBUG nova.compute.manager [req-120de377-9853-41aa-9945-9f4947872783 req-492e4329-7ff0-4228-8bb1-60ed95f7ee9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.802 189070 DEBUG oslo_concurrency.lockutils [req-120de377-9853-41aa-9945-9f4947872783 req-492e4329-7ff0-4228-8bb1-60ed95f7ee9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.802 189070 DEBUG oslo_concurrency.lockutils [req-120de377-9853-41aa-9945-9f4947872783 req-492e4329-7ff0-4228-8bb1-60ed95f7ee9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.802 189070 DEBUG oslo_concurrency.lockutils [req-120de377-9853-41aa-9945-9f4947872783 req-492e4329-7ff0-4228-8bb1-60ed95f7ee9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.803 189070 DEBUG nova.compute.manager [req-120de377-9853-41aa-9945-9f4947872783 req-492e4329-7ff0-4228-8bb1-60ed95f7ee9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Processing event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.803 189070 DEBUG nova.compute.manager [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.807 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926896.8073554, 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.808 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] VM Resumed (Lifecycle Event)
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.812 189070 INFO nova.virt.libvirt.driver [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance running successfully.
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.813 189070 DEBUG nova.virt.libvirt.driver [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.849 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.852 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.888 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Dec 05 09:28:16 compute-1 nova_compute[189066]: 2025-12-05 09:28:16.954 189070 INFO nova.compute.manager [None req-f66eff75-7cd1-44f3-a5f4-b38314c0c9dc 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance to original state: 'active'
Dec 05 09:28:17 compute-1 podman[224493]: 2025-12-05 09:28:17.641969865 +0000 UTC m=+0.077024384 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 09:28:17 compute-1 nova_compute[189066]: 2025-12-05 09:28:17.904 189070 DEBUG nova.network.neutron [req-a03f6cf6-0751-4400-aee9-d14b275cca40 req-3fbfedf7-7bcb-458f-94f6-c804b96bbd38 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updated VIF entry in instance network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:28:17 compute-1 nova_compute[189066]: 2025-12-05 09:28:17.904 189070 DEBUG nova.network.neutron [req-a03f6cf6-0751-4400-aee9-d14b275cca40 req-3fbfedf7-7bcb-458f-94f6-c804b96bbd38 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:28:17 compute-1 nova_compute[189066]: 2025-12-05 09:28:17.938 189070 DEBUG oslo_concurrency.lockutils [req-a03f6cf6-0751-4400-aee9-d14b275cca40 req-3fbfedf7-7bcb-458f-94f6-c804b96bbd38 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:28:17 compute-1 nova_compute[189066]: 2025-12-05 09:28:17.939 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:28:17 compute-1 nova_compute[189066]: 2025-12-05 09:28:17.939 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:28:17 compute-1 nova_compute[189066]: 2025-12-05 09:28:17.939 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:28:18 compute-1 nova_compute[189066]: 2025-12-05 09:28:18.781 189070 DEBUG nova.network.neutron [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Successfully updated port: 6cdc578a-4ede-49b9-83a0-819716269b48 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:28:19 compute-1 nova_compute[189066]: 2025-12-05 09:28:19.051 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "refresh_cache-2bdfdb89-21af-43b6-93eb-48a637bfbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:28:19 compute-1 nova_compute[189066]: 2025-12-05 09:28:19.051 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquired lock "refresh_cache-2bdfdb89-21af-43b6-93eb-48a637bfbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:28:19 compute-1 nova_compute[189066]: 2025-12-05 09:28:19.051 189070 DEBUG nova.network.neutron [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:28:19 compute-1 nova_compute[189066]: 2025-12-05 09:28:19.191 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:19 compute-1 nova_compute[189066]: 2025-12-05 09:28:19.430 189070 DEBUG nova.network.neutron [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:28:19 compute-1 nova_compute[189066]: 2025-12-05 09:28:19.575 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:19 compute-1 podman[224514]: 2025-12-05 09:28:19.622805285 +0000 UTC m=+0.060779507 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:28:20 compute-1 nova_compute[189066]: 2025-12-05 09:28:20.054 189070 DEBUG nova.compute.manager [req-534f5e79-8d47-46ba-8f4c-e0068e7fce40 req-790ddf81-521e-45a0-87d6-1e527e786ae9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:20 compute-1 nova_compute[189066]: 2025-12-05 09:28:20.055 189070 DEBUG oslo_concurrency.lockutils [req-534f5e79-8d47-46ba-8f4c-e0068e7fce40 req-790ddf81-521e-45a0-87d6-1e527e786ae9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:20 compute-1 nova_compute[189066]: 2025-12-05 09:28:20.055 189070 DEBUG oslo_concurrency.lockutils [req-534f5e79-8d47-46ba-8f4c-e0068e7fce40 req-790ddf81-521e-45a0-87d6-1e527e786ae9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:20 compute-1 nova_compute[189066]: 2025-12-05 09:28:20.056 189070 DEBUG oslo_concurrency.lockutils [req-534f5e79-8d47-46ba-8f4c-e0068e7fce40 req-790ddf81-521e-45a0-87d6-1e527e786ae9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:20 compute-1 nova_compute[189066]: 2025-12-05 09:28:20.056 189070 DEBUG nova.compute.manager [req-534f5e79-8d47-46ba-8f4c-e0068e7fce40 req-790ddf81-521e-45a0-87d6-1e527e786ae9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:20 compute-1 nova_compute[189066]: 2025-12-05 09:28:20.057 189070 WARNING nova.compute.manager [req-534f5e79-8d47-46ba-8f4c-e0068e7fce40 req-790ddf81-521e-45a0-87d6-1e527e786ae9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state active and task_state None.
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.365 189070 DEBUG nova.network.neutron [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Updating instance_info_cache with network_info: [{"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.371 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.396 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.396 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.397 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Releasing lock "refresh_cache-2bdfdb89-21af-43b6-93eb-48a637bfbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.397 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Instance network_info: |[{"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.397 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.399 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Start _get_guest_xml network_info=[{"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.400 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.400 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.404 189070 WARNING nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.411 189070 DEBUG nova.virt.libvirt.host [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.411 189070 DEBUG nova.virt.libvirt.host [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.415 189070 DEBUG nova.virt.libvirt.host [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.415 189070 DEBUG nova.virt.libvirt.host [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.417 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.417 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.418 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.418 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.418 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.418 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.419 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.419 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.419 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.419 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.420 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.420 189070 DEBUG nova.virt.hardware [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.423 189070 DEBUG nova.virt.libvirt.vif [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:28:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-gen-0-1100534983',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-gen-0-1100534983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1223075532-ge',id=14,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGz0y8jwIN4iPTFbzm3Hzs8Gj5T8iGXuYAPED3QXZF12BYqK3ZPCYby6NEQkacCTzSuBzj+1xJy02xU8j3suuVwp8Fj+P/tKJdPWLIb14U774zp1EnovtYJFW3cus5PkUA==',key_name='tempest-TestSecurityGroupsBasicOps-295190475',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6c5bb818cba543bbb1bcff8df31dd9cd',ramdisk_id='',reservation_id='r-kp7icvtw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1223075532',owner_user_name='tempest-TestSecurityGroupsBasicOps-1223075532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:28:09Z,user_data=None,user_id='bcc37d16c39547bba794fb1f43e889c1',uuid=2bdfdb89-21af-43b6-93eb-48a637bfbd4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.424 189070 DEBUG nova.network.os_vif_util [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converting VIF {"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.425 189070 DEBUG nova.network.os_vif_util [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:88:28,bridge_name='br-int',has_traffic_filtering=True,id=6cdc578a-4ede-49b9-83a0-819716269b48,network=Network(846936f3-e466-44ea-ae63-40fc6aa49531),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cdc578a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.426 189070 DEBUG nova.objects.instance [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lazy-loading 'pci_devices' on Instance uuid 2bdfdb89-21af-43b6-93eb-48a637bfbd4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.460 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <uuid>2bdfdb89-21af-43b6-93eb-48a637bfbd4c</uuid>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <name>instance-0000000e</name>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-gen-0-1100534983</nova:name>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:28:21</nova:creationTime>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:28:21 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:28:21 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:28:21 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:28:21 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:28:21 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:28:21 compute-1 nova_compute[189066]:         <nova:user uuid="bcc37d16c39547bba794fb1f43e889c1">tempest-TestSecurityGroupsBasicOps-1223075532-project-member</nova:user>
Dec 05 09:28:21 compute-1 nova_compute[189066]:         <nova:project uuid="6c5bb818cba543bbb1bcff8df31dd9cd">tempest-TestSecurityGroupsBasicOps-1223075532</nova:project>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:28:21 compute-1 nova_compute[189066]:         <nova:port uuid="6cdc578a-4ede-49b9-83a0-819716269b48">
Dec 05 09:28:21 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <system>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <entry name="serial">2bdfdb89-21af-43b6-93eb-48a637bfbd4c</entry>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <entry name="uuid">2bdfdb89-21af-43b6-93eb-48a637bfbd4c</entry>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </system>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <os>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   </os>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <features>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   </features>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk.config"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:8c:88:28"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <target dev="tap6cdc578a-4e"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/console.log" append="off"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <video>
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </video>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:28:21 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:28:21 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:28:21 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:28:21 compute-1 nova_compute[189066]: </domain>
Dec 05 09:28:21 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.462 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Preparing to wait for external event network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.462 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.462 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.462 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.463 189070 DEBUG nova.virt.libvirt.vif [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:28:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-gen-0-1100534983',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-gen-0-1100534983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1223075532-ge',id=14,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGz0y8jwIN4iPTFbzm3Hzs8Gj5T8iGXuYAPED3QXZF12BYqK3ZPCYby6NEQkacCTzSuBzj+1xJy02xU8j3suuVwp8Fj+P/tKJdPWLIb14U774zp1EnovtYJFW3cus5PkUA==',key_name='tempest-TestSecurityGroupsBasicOps-295190475',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6c5bb818cba543bbb1bcff8df31dd9cd',ramdisk_id='',reservation_id='r-kp7icvtw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1223075532',owner_user_name='tempest-TestSecurityGroupsBasicOps-1223075532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:28:09Z,user_data=None,user_id='bcc37d16c39547bba794fb1f43e889c1',uuid=2bdfdb89-21af-43b6-93eb-48a637bfbd4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.464 189070 DEBUG nova.network.os_vif_util [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converting VIF {"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.464 189070 DEBUG nova.network.os_vif_util [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:88:28,bridge_name='br-int',has_traffic_filtering=True,id=6cdc578a-4ede-49b9-83a0-819716269b48,network=Network(846936f3-e466-44ea-ae63-40fc6aa49531),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cdc578a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.465 189070 DEBUG os_vif [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:88:28,bridge_name='br-int',has_traffic_filtering=True,id=6cdc578a-4ede-49b9-83a0-819716269b48,network=Network(846936f3-e466-44ea-ae63-40fc6aa49531),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cdc578a-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.465 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.466 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.466 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.469 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.469 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6cdc578a-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.469 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6cdc578a-4e, col_values=(('external_ids', {'iface-id': '6cdc578a-4ede-49b9-83a0-819716269b48', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:88:28', 'vm-uuid': '2bdfdb89-21af-43b6-93eb-48a637bfbd4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:21 compute-1 NetworkManager[55704]: <info>  [1764926901.5200] manager: (tap6cdc578a-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.519 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.523 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.527 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.528 189070 INFO os_vif [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:88:28,bridge_name='br-int',has_traffic_filtering=True,id=6cdc578a-4ede-49b9-83a0-819716269b48,network=Network(846936f3-e466-44ea-ae63-40fc6aa49531),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cdc578a-4e')
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.614 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.616 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.616 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] No VIF found with MAC fa:16:3e:8c:88:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:28:21 compute-1 nova_compute[189066]: 2025-12-05 09:28:21.617 189070 INFO nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Using config drive
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.067 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.077 189070 INFO nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Creating config drive at /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk.config
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.085 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl_rfxllw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.233 189070 DEBUG oslo_concurrency.processutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl_rfxllw" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:28:22 compute-1 kernel: tap6cdc578a-4e: entered promiscuous mode
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.297 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:22 compute-1 NetworkManager[55704]: <info>  [1764926902.2991] manager: (tap6cdc578a-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Dec 05 09:28:22 compute-1 ovn_controller[95809]: 2025-12-05T09:28:22Z|00103|binding|INFO|Claiming lport 6cdc578a-4ede-49b9-83a0-819716269b48 for this chassis.
Dec 05 09:28:22 compute-1 ovn_controller[95809]: 2025-12-05T09:28:22Z|00104|binding|INFO|6cdc578a-4ede-49b9-83a0-819716269b48: Claiming fa:16:3e:8c:88:28 10.100.0.7
Dec 05 09:28:22 compute-1 ovn_controller[95809]: 2025-12-05T09:28:22Z|00105|binding|INFO|Setting lport 6cdc578a-4ede-49b9-83a0-819716269b48 ovn-installed in OVS
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.314 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:22 compute-1 systemd-machined[154815]: New machine qemu-8-instance-0000000e.
Dec 05 09:28:22 compute-1 systemd-udevd[224558]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:28:22 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-0000000e.
Dec 05 09:28:22 compute-1 NetworkManager[55704]: <info>  [1764926902.3590] device (tap6cdc578a-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:28:22 compute-1 NetworkManager[55704]: <info>  [1764926902.3605] device (tap6cdc578a-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.480 189070 DEBUG nova.compute.manager [req-a04f70b4-1740-488b-a9d3-93161222f6c4 req-5d124b35-f721-47da-815b-0437b6108d5a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received event network-changed-6cdc578a-4ede-49b9-83a0-819716269b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.482 189070 DEBUG nova.compute.manager [req-a04f70b4-1740-488b-a9d3-93161222f6c4 req-5d124b35-f721-47da-815b-0437b6108d5a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Refreshing instance network info cache due to event network-changed-6cdc578a-4ede-49b9-83a0-819716269b48. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.483 189070 DEBUG oslo_concurrency.lockutils [req-a04f70b4-1740-488b-a9d3-93161222f6c4 req-5d124b35-f721-47da-815b-0437b6108d5a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-2bdfdb89-21af-43b6-93eb-48a637bfbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.483 189070 DEBUG oslo_concurrency.lockutils [req-a04f70b4-1740-488b-a9d3-93161222f6c4 req-5d124b35-f721-47da-815b-0437b6108d5a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-2bdfdb89-21af-43b6-93eb-48a637bfbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.484 189070 DEBUG nova.network.neutron [req-a04f70b4-1740-488b-a9d3-93161222f6c4 req-5d124b35-f721-47da-815b-0437b6108d5a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Refreshing network info cache for port 6cdc578a-4ede-49b9-83a0-819716269b48 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:28:22 compute-1 ovn_controller[95809]: 2025-12-05T09:28:22Z|00106|binding|INFO|Setting lport 6cdc578a-4ede-49b9-83a0-819716269b48 up in Southbound
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.504 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:88:28 10.100.0.7'], port_security=['fa:16:3e:8c:88:28 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2bdfdb89-21af-43b6-93eb-48a637bfbd4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-846936f3-e466-44ea-ae63-40fc6aa49531', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9012b762-0914-4346-9b93-d550f3c25e13', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a2ff281-26e9-4eb5-b083-ea6de485eb77, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=6cdc578a-4ede-49b9-83a0-819716269b48) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.508 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 6cdc578a-4ede-49b9-83a0-819716269b48 in datapath 846936f3-e466-44ea-ae63-40fc6aa49531 bound to our chassis
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.513 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 846936f3-e466-44ea-ae63-40fc6aa49531
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.528 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[cba436d5-d484-4d0b-8ede-67ec37a33809]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.530 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap846936f3-e1 in ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.533 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap846936f3-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.533 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[99078e48-5e02-4d3f-bde8-0e0ea435a127]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.541 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9748dd1b-be65-47fd-b392-03a9e30af5fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.557 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[415abe5f-0991-4f68-bb61-82348e86d924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.587 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[71ffe58d-1410-4110-8546-079afa8d48d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.627 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[93aed868-ef73-4706-9bc1-f6fe298e84d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 NetworkManager[55704]: <info>  [1764926902.6364] manager: (tap846936f3-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.635 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f20617-a1f7-4914-bb07-e828d2200479]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.673 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[3e249c66-3436-4d14-9586-d7187d170359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.676 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[58b593e6-c202-4d8f-8c47-12f9b3de13b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 NetworkManager[55704]: <info>  [1764926902.7037] device (tap846936f3-e0): carrier: link connected
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.712 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[ddfbb87b-a3a1-4d05-ba77-e8718543cdf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.730 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7fadae0f-0dad-460d-83a4-b767e7c06794]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap846936f3-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:71:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417570, 'reachable_time': 28629, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224598, 'error': None, 'target': 'ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.744 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[67d9cadf-f062-4c3e-b8c6-744f66d7d5cc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:7163'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417570, 'tstamp': 417570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224599, 'error': None, 'target': 'ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.757 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926902.7566075, 2bdfdb89-21af-43b6-93eb-48a637bfbd4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.758 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] VM Started (Lifecycle Event)
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.768 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[82949e29-05b1-4af2-b4f8-36c93053fda5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap846936f3-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:71:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417570, 'reachable_time': 28629, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224600, 'error': None, 'target': 'ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.807 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[87b3bed4-aeeb-40c9-812d-384ec00ac669]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.883 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[aff1dc78-0a45-434c-acdc-e235ff784acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.885 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap846936f3-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.885 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.886 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap846936f3-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:22 compute-1 kernel: tap846936f3-e0: entered promiscuous mode
Dec 05 09:28:22 compute-1 NetworkManager[55704]: <info>  [1764926902.8899] manager: (tap846936f3-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.891 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.892 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap846936f3-e0, col_values=(('external_ids', {'iface-id': 'a470ec20-79a2-4492-bf5d-41ca50657a47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:22 compute-1 ovn_controller[95809]: 2025-12-05T09:28:22Z|00107|binding|INFO|Releasing lport a470ec20-79a2-4492-bf5d-41ca50657a47 from this chassis (sb_readonly=0)
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.894 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/846936f3-e466-44ea-ae63-40fc6aa49531.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/846936f3-e466-44ea-ae63-40fc6aa49531.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.895 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4beb1881-a349-4562-8150-0183d73fd3f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.896 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-846936f3-e466-44ea-ae63-40fc6aa49531
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/846936f3-e466-44ea-ae63-40fc6aa49531.pid.haproxy
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 846936f3-e466-44ea-ae63-40fc6aa49531
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:28:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:22.896 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531', 'env', 'PROCESS_TAG=haproxy-846936f3-e466-44ea-ae63-40fc6aa49531', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/846936f3-e466-44ea-ae63-40fc6aa49531.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.906 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.986 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.990 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926902.7567186, 2bdfdb89-21af-43b6-93eb-48a637bfbd4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:28:22 compute-1 nova_compute[189066]: 2025-12-05 09:28:22.991 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] VM Paused (Lifecycle Event)
Dec 05 09:28:23 compute-1 podman[224633]: 2025-12-05 09:28:23.329699957 +0000 UTC m=+0.063896753 container create e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:28:23 compute-1 systemd[1]: Started libpod-conmon-e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe.scope.
Dec 05 09:28:23 compute-1 podman[224633]: 2025-12-05 09:28:23.301718883 +0000 UTC m=+0.035915709 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:28:23 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:28:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc574073b4218a7a6721097f6d7a3d74cf01d205c90afd8a27e088a422a689cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:28:23 compute-1 podman[224633]: 2025-12-05 09:28:23.424379471 +0000 UTC m=+0.158576267 container init e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:28:23 compute-1 podman[224633]: 2025-12-05 09:28:23.432070079 +0000 UTC m=+0.166266875 container start e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:28:23 compute-1 neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531[224648]: [NOTICE]   (224652) : New worker (224654) forked
Dec 05 09:28:23 compute-1 neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531[224648]: [NOTICE]   (224652) : Loading success.
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.156 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.161 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.194 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.789 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.808 189070 DEBUG nova.compute.manager [req-ed839da9-a3b6-4243-b02b-e973354c2b52 req-aee40ad2-5989-4463-ba99-95e583eac4dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received event network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.808 189070 DEBUG oslo_concurrency.lockutils [req-ed839da9-a3b6-4243-b02b-e973354c2b52 req-aee40ad2-5989-4463-ba99-95e583eac4dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.809 189070 DEBUG oslo_concurrency.lockutils [req-ed839da9-a3b6-4243-b02b-e973354c2b52 req-aee40ad2-5989-4463-ba99-95e583eac4dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.809 189070 DEBUG oslo_concurrency.lockutils [req-ed839da9-a3b6-4243-b02b-e973354c2b52 req-aee40ad2-5989-4463-ba99-95e583eac4dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.809 189070 DEBUG nova.compute.manager [req-ed839da9-a3b6-4243-b02b-e973354c2b52 req-aee40ad2-5989-4463-ba99-95e583eac4dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Processing event network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.810 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.823 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926904.8227985, 2bdfdb89-21af-43b6-93eb-48a637bfbd4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.824 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] VM Resumed (Lifecycle Event)
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.827 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.831 189070 INFO nova.virt.libvirt.driver [-] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Instance spawned successfully.
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.831 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.869 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.874 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.891 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.892 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.892 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.893 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.893 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.894 189070 DEBUG nova.virt.libvirt.driver [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:28:24 compute-1 nova_compute[189066]: 2025-12-05 09:28:24.898 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:28:26 compute-1 nova_compute[189066]: 2025-12-05 09:28:26.556 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:26 compute-1 podman[224663]: 2025-12-05 09:28:26.656196699 +0000 UTC m=+0.062879978 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:28:27 compute-1 nova_compute[189066]: 2025-12-05 09:28:27.299 189070 INFO nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Took 17.77 seconds to spawn the instance on the hypervisor.
Dec 05 09:28:27 compute-1 nova_compute[189066]: 2025-12-05 09:28:27.300 189070 DEBUG nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:27 compute-1 nova_compute[189066]: 2025-12-05 09:28:27.651 189070 INFO nova.compute.manager [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Took 18.97 seconds to build instance.
Dec 05 09:28:27 compute-1 nova_compute[189066]: 2025-12-05 09:28:27.799 189070 DEBUG oslo_concurrency.lockutils [None req-947d3976-ed56-4ac5-ba59-39c49cb0c8cd bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:28 compute-1 nova_compute[189066]: 2025-12-05 09:28:28.094 189070 DEBUG nova.compute.manager [req-94621b82-d034-4e01-8cb9-8da762b8ac0e req-f60f2363-bdd9-4786-8bd3-35e3ac4e9dd7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received event network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:28 compute-1 nova_compute[189066]: 2025-12-05 09:28:28.095 189070 DEBUG oslo_concurrency.lockutils [req-94621b82-d034-4e01-8cb9-8da762b8ac0e req-f60f2363-bdd9-4786-8bd3-35e3ac4e9dd7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:28 compute-1 nova_compute[189066]: 2025-12-05 09:28:28.095 189070 DEBUG oslo_concurrency.lockutils [req-94621b82-d034-4e01-8cb9-8da762b8ac0e req-f60f2363-bdd9-4786-8bd3-35e3ac4e9dd7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:28 compute-1 nova_compute[189066]: 2025-12-05 09:28:28.095 189070 DEBUG oslo_concurrency.lockutils [req-94621b82-d034-4e01-8cb9-8da762b8ac0e req-f60f2363-bdd9-4786-8bd3-35e3ac4e9dd7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:28 compute-1 nova_compute[189066]: 2025-12-05 09:28:28.096 189070 DEBUG nova.compute.manager [req-94621b82-d034-4e01-8cb9-8da762b8ac0e req-f60f2363-bdd9-4786-8bd3-35e3ac4e9dd7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] No waiting events found dispatching network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:28 compute-1 nova_compute[189066]: 2025-12-05 09:28:28.096 189070 WARNING nova.compute.manager [req-94621b82-d034-4e01-8cb9-8da762b8ac0e req-f60f2363-bdd9-4786-8bd3-35e3ac4e9dd7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received unexpected event network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 for instance with vm_state active and task_state None.
Dec 05 09:28:29 compute-1 nova_compute[189066]: 2025-12-05 09:28:29.198 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:29 compute-1 nova_compute[189066]: 2025-12-05 09:28:29.711 189070 DEBUG nova.network.neutron [req-a04f70b4-1740-488b-a9d3-93161222f6c4 req-5d124b35-f721-47da-815b-0437b6108d5a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Updated VIF entry in instance network info cache for port 6cdc578a-4ede-49b9-83a0-819716269b48. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:28:29 compute-1 nova_compute[189066]: 2025-12-05 09:28:29.712 189070 DEBUG nova.network.neutron [req-a04f70b4-1740-488b-a9d3-93161222f6c4 req-5d124b35-f721-47da-815b-0437b6108d5a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Updating instance_info_cache with network_info: [{"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:28:29 compute-1 nova_compute[189066]: 2025-12-05 09:28:29.944 189070 DEBUG oslo_concurrency.lockutils [req-a04f70b4-1740-488b-a9d3-93161222f6c4 req-5d124b35-f721-47da-815b-0437b6108d5a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-2bdfdb89-21af-43b6-93eb-48a637bfbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:28:30 compute-1 ovn_controller[95809]: 2025-12-05T09:28:30Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ad:12:e3 10.100.0.14
Dec 05 09:28:31 compute-1 nova_compute[189066]: 2025-12-05 09:28:31.611 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:34 compute-1 nova_compute[189066]: 2025-12-05 09:28:34.200 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:34 compute-1 podman[224690]: 2025-12-05 09:28:34.671996248 +0000 UTC m=+0.100118168 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 05 09:28:36 compute-1 nova_compute[189066]: 2025-12-05 09:28:36.614 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:37 compute-1 nova_compute[189066]: 2025-12-05 09:28:37.131 189070 INFO nova.compute.manager [None req-5c86ffaa-0f6c-471f-a1ec-7c386726a42f 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Get console output
Dec 05 09:28:37 compute-1 nova_compute[189066]: 2025-12-05 09:28:37.140 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:28:38 compute-1 podman[224726]: 2025-12-05 09:28:38.669272558 +0000 UTC m=+0.104874335 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:28:38 compute-1 ovn_controller[95809]: 2025-12-05T09:28:38Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8c:88:28 10.100.0.7
Dec 05 09:28:38 compute-1 ovn_controller[95809]: 2025-12-05T09:28:38Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:88:28 10.100.0.7
Dec 05 09:28:39 compute-1 nova_compute[189066]: 2025-12-05 09:28:39.204 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:40 compute-1 podman[224753]: 2025-12-05 09:28:40.644388029 +0000 UTC m=+0.081263038 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:28:41 compute-1 ovn_controller[95809]: 2025-12-05T09:28:41Z|00108|binding|INFO|Releasing lport da7e4261-23bc-43e6-a0d1-1c34dd95f6c8 from this chassis (sb_readonly=0)
Dec 05 09:28:41 compute-1 ovn_controller[95809]: 2025-12-05T09:28:41Z|00109|binding|INFO|Releasing lport a470ec20-79a2-4492-bf5d-41ca50657a47 from this chassis (sb_readonly=0)
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.120 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.661 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.781 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.782 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.782 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.782 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.782 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.783 189070 INFO nova.compute.manager [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Terminating instance
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.784 189070 DEBUG nova.compute.manager [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:28:41 compute-1 kernel: tapfee19e88-d1 (unregistering): left promiscuous mode
Dec 05 09:28:41 compute-1 NetworkManager[55704]: <info>  [1764926921.8222] device (tapfee19e88-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:28:41 compute-1 ovn_controller[95809]: 2025-12-05T09:28:41Z|00110|binding|INFO|Releasing lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 from this chassis (sb_readonly=0)
Dec 05 09:28:41 compute-1 ovn_controller[95809]: 2025-12-05T09:28:41Z|00111|binding|INFO|Setting lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 down in Southbound
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.829 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:41 compute-1 ovn_controller[95809]: 2025-12-05T09:28:41Z|00112|binding|INFO|Removing iface tapfee19e88-d1 ovn-installed in OVS
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.833 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:41.843 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:12:e3 10.100.0.14'], port_security=['fa:16:3e:ad:12:e3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '14', 'neutron:security_group_ids': '8cc6de91-de2a-4730-92b2-0be4bf62385a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512618ca-52a2-4c70-b76c-f25c0d00e2da, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=fee19e88-d18e-4020-97b6-26caf4ef6fa9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:28:41 compute-1 nova_compute[189066]: 2025-12-05 09:28:41.846 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:41.847 105272 INFO neutron.agent.ovn.metadata.agent [-] Port fee19e88-d18e-4020-97b6-26caf4ef6fa9 in datapath 2d5ff262-0b2d-49fb-b643-980510ce97c7 unbound from our chassis
Dec 05 09:28:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:41.850 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2d5ff262-0b2d-49fb-b643-980510ce97c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:28:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:41.851 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[61143b04-401c-448e-af4e-96d82d0ca6b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:41 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:41.853 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 namespace which is not needed anymore
Dec 05 09:28:41 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 05 09:28:41 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Consumed 15.275s CPU time.
Dec 05 09:28:41 compute-1 systemd-machined[154815]: Machine qemu-7-instance-0000000a terminated.
Dec 05 09:28:42 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[224476]: [NOTICE]   (224481) : haproxy version is 2.8.14-c23fe91
Dec 05 09:28:42 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[224476]: [NOTICE]   (224481) : path to executable is /usr/sbin/haproxy
Dec 05 09:28:42 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[224476]: [WARNING]  (224481) : Exiting Master process...
Dec 05 09:28:42 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[224476]: [ALERT]    (224481) : Current worker (224484) exited with code 143 (Terminated)
Dec 05 09:28:42 compute-1 neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7[224476]: [WARNING]  (224481) : All workers exited. Exiting... (0)
Dec 05 09:28:42 compute-1 systemd[1]: libpod-fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e.scope: Deactivated successfully.
Dec 05 09:28:42 compute-1 kernel: tapfee19e88-d1: entered promiscuous mode
Dec 05 09:28:42 compute-1 NetworkManager[55704]: <info>  [1764926922.0112] manager: (tapfee19e88-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Dec 05 09:28:42 compute-1 podman[224797]: 2025-12-05 09:28:42.011836065 +0000 UTC m=+0.051928300 container died fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 05 09:28:42 compute-1 ovn_controller[95809]: 2025-12-05T09:28:42Z|00113|binding|INFO|Claiming lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 for this chassis.
Dec 05 09:28:42 compute-1 ovn_controller[95809]: 2025-12-05T09:28:42Z|00114|binding|INFO|fee19e88-d18e-4020-97b6-26caf4ef6fa9: Claiming fa:16:3e:ad:12:e3 10.100.0.14
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.013 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:42 compute-1 kernel: tapfee19e88-d1 (unregistering): left promiscuous mode
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.022 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:12:e3 10.100.0.14'], port_security=['fa:16:3e:ad:12:e3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '14', 'neutron:security_group_ids': '8cc6de91-de2a-4730-92b2-0be4bf62385a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512618ca-52a2-4c70-b76c-f25c0d00e2da, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=fee19e88-d18e-4020-97b6-26caf4ef6fa9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.038 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:42 compute-1 ovn_controller[95809]: 2025-12-05T09:28:42Z|00115|binding|INFO|Releasing lport fee19e88-d18e-4020-97b6-26caf4ef6fa9 from this chassis (sb_readonly=0)
Dec 05 09:28:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e-userdata-shm.mount: Deactivated successfully.
Dec 05 09:28:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-a98e71583e7427a1c8bbbe932a85ece63cbd2d0e4f4e53f87b0b0ece8df447ec-merged.mount: Deactivated successfully.
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.058 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:12:e3 10.100.0.14'], port_security=['fa:16:3e:ad:12:e3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9d2b0f76-0408-4f6b-8a37-ac9882a44b4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '14', 'neutron:security_group_ids': '8cc6de91-de2a-4730-92b2-0be4bf62385a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=512618ca-52a2-4c70-b76c-f25c0d00e2da, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=fee19e88-d18e-4020-97b6-26caf4ef6fa9) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:28:42 compute-1 podman[224797]: 2025-12-05 09:28:42.066110341 +0000 UTC m=+0.106202596 container cleanup fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.071 189070 INFO nova.virt.libvirt.driver [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Instance destroyed successfully.
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.072 189070 DEBUG nova.objects.instance [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'resources' on Instance uuid 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:28:42 compute-1 systemd[1]: libpod-conmon-fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e.scope: Deactivated successfully.
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.097 189070 DEBUG nova.virt.libvirt.vif [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T09:26:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1301659118',display_name='tempest-TestNetworkAdvancedServerOps-server-1301659118',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1301659118',id=10,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE/IH5546c3h1NqRaMnG8oeFTcOTykF/TM0HkVvHYHh6GuyCm0ZuW+dN3e3eFJ7fYYhI257JWo40Mrp34xaUZfOzZGF5l/jYsmwg8LZO1T3z+5cPaeu5EMQJR4Fd4T/olg==',key_name='tempest-TestNetworkAdvancedServerOps-1059800776',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:28:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-i1fw0oxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:28:17Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=9d2b0f76-0408-4f6b-8a37-ac9882a44b4e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.098 189070 DEBUG nova.network.os_vif_util [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.099 189070 DEBUG nova.network.os_vif_util [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.099 189070 DEBUG os_vif [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.102 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.102 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfee19e88-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.104 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.107 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.110 189070 INFO os_vif [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:12:e3,bridge_name='br-int',has_traffic_filtering=True,id=fee19e88-d18e-4020-97b6-26caf4ef6fa9,network=Network(2d5ff262-0b2d-49fb-b643-980510ce97c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfee19e88-d1')
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.111 189070 INFO nova.virt.libvirt.driver [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Deleting instance files /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_del
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.117 189070 INFO nova.virt.libvirt.driver [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Deletion of /var/lib/nova/instances/9d2b0f76-0408-4f6b-8a37-ac9882a44b4e_del complete
Dec 05 09:28:42 compute-1 podman[224835]: 2025-12-05 09:28:42.136260726 +0000 UTC m=+0.046942838 container remove fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.142 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b2bb01b1-7da5-48e4-bc9e-6c7ae824c831]: (4, ('Fri Dec  5 09:28:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 (fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e)\nfc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e\nFri Dec  5 09:28:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 (fc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e)\nfc928b59a3e5d862d68ae647d7cf7b3926d36f3eda22fadc3aef3ef5c66e850e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.145 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[79c402be-b9c7-4b26-ba1d-4e58b924c1d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.146 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d5ff262-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:42 compute-1 kernel: tap2d5ff262-00: left promiscuous mode
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.150 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:42 compute-1 nova_compute[189066]: 2025-12-05 09:28:42.159 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.165 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ad32da-a3f7-445c-a233-653119608f18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.186 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7dd49e-0ed3-4f99-bed5-91bb657e8978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.187 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[62b70d49-4ec0-4d0b-a44a-e7ed9e2c5a25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.206 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[57ff739b-df3c-4ace-95e6-238174694537]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416775, 'reachable_time': 15948, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224852, 'error': None, 'target': 'ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.210 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2d5ff262-0b2d-49fb-b643-980510ce97c7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.211 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[5aca0e03-4e50-4d03-b5c4-370b432abe14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:42 compute-1 systemd[1]: run-netns-ovnmeta\x2d2d5ff262\x2d0b2d\x2d49fb\x2db643\x2d980510ce97c7.mount: Deactivated successfully.
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.212 105272 INFO neutron.agent.ovn.metadata.agent [-] Port fee19e88-d18e-4020-97b6-26caf4ef6fa9 in datapath 2d5ff262-0b2d-49fb-b643-980510ce97c7 unbound from our chassis
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.213 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2d5ff262-0b2d-49fb-b643-980510ce97c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.214 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7c8d57-7bb4-46f9-bce6-5942579055ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.215 105272 INFO neutron.agent.ovn.metadata.agent [-] Port fee19e88-d18e-4020-97b6-26caf4ef6fa9 in datapath 2d5ff262-0b2d-49fb-b643-980510ce97c7 unbound from our chassis
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.216 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2d5ff262-0b2d-49fb-b643-980510ce97c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:28:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:42.216 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[394f6ec5-6db7-49de-9dd2-893f3c414a59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.145 189070 INFO nova.compute.manager [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Took 1.36 seconds to destroy the instance on the hypervisor.
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.146 189070 DEBUG oslo.service.loopingcall [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.146 189070 DEBUG nova.compute.manager [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.146 189070 DEBUG nova.network.neutron [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:28:43 compute-1 podman[224853]: 2025-12-05 09:28:43.637889962 +0000 UTC m=+0.073575290 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.830 189070 DEBUG nova.compute.manager [req-858d8790-6b46-4ca3-9ae7-baf9ac3ec4ac req-94d08b7f-70cb-4607-899e-0b246a814ec8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.831 189070 DEBUG oslo_concurrency.lockutils [req-858d8790-6b46-4ca3-9ae7-baf9ac3ec4ac req-94d08b7f-70cb-4607-899e-0b246a814ec8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.831 189070 DEBUG oslo_concurrency.lockutils [req-858d8790-6b46-4ca3-9ae7-baf9ac3ec4ac req-94d08b7f-70cb-4607-899e-0b246a814ec8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.831 189070 DEBUG oslo_concurrency.lockutils [req-858d8790-6b46-4ca3-9ae7-baf9ac3ec4ac req-94d08b7f-70cb-4607-899e-0b246a814ec8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.831 189070 DEBUG nova.compute.manager [req-858d8790-6b46-4ca3-9ae7-baf9ac3ec4ac req-94d08b7f-70cb-4607-899e-0b246a814ec8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.831 189070 DEBUG nova.compute.manager [req-858d8790-6b46-4ca3-9ae7-baf9ac3ec4ac req-94d08b7f-70cb-4607-899e-0b246a814ec8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-unplugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.915 189070 DEBUG nova.compute.manager [req-83dbb778-8336-46ef-a28c-4ea8d867b4db req-8857e56f-fc04-41ff-bd50-8d2bba824b9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.915 189070 DEBUG nova.compute.manager [req-83dbb778-8336-46ef-a28c-4ea8d867b4db req-8857e56f-fc04-41ff-bd50-8d2bba824b9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing instance network info cache due to event network-changed-fee19e88-d18e-4020-97b6-26caf4ef6fa9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.915 189070 DEBUG oslo_concurrency.lockutils [req-83dbb778-8336-46ef-a28c-4ea8d867b4db req-8857e56f-fc04-41ff-bd50-8d2bba824b9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.916 189070 DEBUG oslo_concurrency.lockutils [req-83dbb778-8336-46ef-a28c-4ea8d867b4db req-8857e56f-fc04-41ff-bd50-8d2bba824b9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:28:43 compute-1 nova_compute[189066]: 2025-12-05 09:28:43.916 189070 DEBUG nova.network.neutron [req-83dbb778-8336-46ef-a28c-4ea8d867b4db req-8857e56f-fc04-41ff-bd50-8d2bba824b9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Refreshing network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:28:44 compute-1 nova_compute[189066]: 2025-12-05 09:28:44.205 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:44 compute-1 nova_compute[189066]: 2025-12-05 09:28:44.314 189070 DEBUG nova.network.neutron [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:28:44 compute-1 nova_compute[189066]: 2025-12-05 09:28:44.359 189070 INFO nova.compute.manager [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Took 1.21 seconds to deallocate network for instance.
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.311 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.311 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.385 189070 DEBUG nova.compute.provider_tree [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.548 189070 DEBUG nova.scheduler.client.report [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.592 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.640 189070 INFO nova.scheduler.client.report [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Deleted allocations for instance 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.731 189070 DEBUG oslo_concurrency.lockutils [None req-b160e745-4690-43b0-a494-9f447036d57c 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.950s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.968 189070 DEBUG nova.compute.manager [req-409c8ad9-4a3f-4ae0-b698-10a2d0e82c9e req-dcdbbed0-4c8d-466d-8358-76e796ffcddf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.968 189070 DEBUG oslo_concurrency.lockutils [req-409c8ad9-4a3f-4ae0-b698-10a2d0e82c9e req-dcdbbed0-4c8d-466d-8358-76e796ffcddf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.968 189070 DEBUG oslo_concurrency.lockutils [req-409c8ad9-4a3f-4ae0-b698-10a2d0e82c9e req-dcdbbed0-4c8d-466d-8358-76e796ffcddf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.969 189070 DEBUG oslo_concurrency.lockutils [req-409c8ad9-4a3f-4ae0-b698-10a2d0e82c9e req-dcdbbed0-4c8d-466d-8358-76e796ffcddf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "9d2b0f76-0408-4f6b-8a37-ac9882a44b4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.969 189070 DEBUG nova.compute.manager [req-409c8ad9-4a3f-4ae0-b698-10a2d0e82c9e req-dcdbbed0-4c8d-466d-8358-76e796ffcddf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] No waiting events found dispatching network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.969 189070 WARNING nova.compute.manager [req-409c8ad9-4a3f-4ae0-b698-10a2d0e82c9e req-dcdbbed0-4c8d-466d-8358-76e796ffcddf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received unexpected event network-vif-plugged-fee19e88-d18e-4020-97b6-26caf4ef6fa9 for instance with vm_state deleted and task_state None.
Dec 05 09:28:45 compute-1 nova_compute[189066]: 2025-12-05 09:28:45.969 189070 DEBUG nova.compute.manager [req-409c8ad9-4a3f-4ae0-b698-10a2d0e82c9e req-dcdbbed0-4c8d-466d-8358-76e796ffcddf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Received event network-vif-deleted-fee19e88-d18e-4020-97b6-26caf4ef6fa9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.461 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.462 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.462 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.462 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.463 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.464 189070 INFO nova.compute.manager [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Terminating instance
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.466 189070 DEBUG nova.compute.manager [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:28:46 compute-1 kernel: tap6cdc578a-4e (unregistering): left promiscuous mode
Dec 05 09:28:46 compute-1 NetworkManager[55704]: <info>  [1764926926.4901] device (tap6cdc578a-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:28:46 compute-1 ovn_controller[95809]: 2025-12-05T09:28:46Z|00116|binding|INFO|Releasing lport 6cdc578a-4ede-49b9-83a0-819716269b48 from this chassis (sb_readonly=0)
Dec 05 09:28:46 compute-1 ovn_controller[95809]: 2025-12-05T09:28:46Z|00117|binding|INFO|Setting lport 6cdc578a-4ede-49b9-83a0-819716269b48 down in Southbound
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.498 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:46 compute-1 ovn_controller[95809]: 2025-12-05T09:28:46Z|00118|binding|INFO|Removing iface tap6cdc578a-4e ovn-installed in OVS
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.500 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.515 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.538 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:88:28 10.100.0.7'], port_security=['fa:16:3e:8c:88:28 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2bdfdb89-21af-43b6-93eb-48a637bfbd4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-846936f3-e466-44ea-ae63-40fc6aa49531', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9012b762-0914-4346-9b93-d550f3c25e13', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a2ff281-26e9-4eb5-b083-ea6de485eb77, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=6cdc578a-4ede-49b9-83a0-819716269b48) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.540 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 6cdc578a-4ede-49b9-83a0-819716269b48 in datapath 846936f3-e466-44ea-ae63-40fc6aa49531 unbound from our chassis
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.542 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 846936f3-e466-44ea-ae63-40fc6aa49531, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.542 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[08512565-4050-492e-88eb-f7cf267fe7f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.543 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531 namespace which is not needed anymore
Dec 05 09:28:46 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec 05 09:28:46 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000e.scope: Consumed 14.492s CPU time.
Dec 05 09:28:46 compute-1 systemd-machined[154815]: Machine qemu-8-instance-0000000e terminated.
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.601 189070 DEBUG nova.network.neutron [req-83dbb778-8336-46ef-a28c-4ea8d867b4db req-8857e56f-fc04-41ff-bd50-8d2bba824b9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updated VIF entry in instance network info cache for port fee19e88-d18e-4020-97b6-26caf4ef6fa9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.601 189070 DEBUG nova.network.neutron [req-83dbb778-8336-46ef-a28c-4ea8d867b4db req-8857e56f-fc04-41ff-bd50-8d2bba824b9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Updating instance_info_cache with network_info: [{"id": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "address": "fa:16:3e:ad:12:e3", "network": {"id": "2d5ff262-0b2d-49fb-b643-980510ce97c7", "bridge": "br-int", "label": "tempest-network-smoke--2096539509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfee19e88-d1", "ovs_interfaceid": "fee19e88-d18e-4020-97b6-26caf4ef6fa9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.627 189070 DEBUG oslo_concurrency.lockutils [req-83dbb778-8336-46ef-a28c-4ea8d867b4db req-8857e56f-fc04-41ff-bd50-8d2bba824b9e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-9d2b0f76-0408-4f6b-8a37-ac9882a44b4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:28:46 compute-1 neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531[224648]: [NOTICE]   (224652) : haproxy version is 2.8.14-c23fe91
Dec 05 09:28:46 compute-1 neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531[224648]: [NOTICE]   (224652) : path to executable is /usr/sbin/haproxy
Dec 05 09:28:46 compute-1 neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531[224648]: [WARNING]  (224652) : Exiting Master process...
Dec 05 09:28:46 compute-1 neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531[224648]: [WARNING]  (224652) : Exiting Master process...
Dec 05 09:28:46 compute-1 neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531[224648]: [ALERT]    (224652) : Current worker (224654) exited with code 143 (Terminated)
Dec 05 09:28:46 compute-1 neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531[224648]: [WARNING]  (224652) : All workers exited. Exiting... (0)
Dec 05 09:28:46 compute-1 systemd[1]: libpod-e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe.scope: Deactivated successfully.
Dec 05 09:28:46 compute-1 podman[224899]: 2025-12-05 09:28:46.709697381 +0000 UTC m=+0.057220910 container died e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.731 189070 INFO nova.virt.libvirt.driver [-] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Instance destroyed successfully.
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.732 189070 DEBUG nova.objects.instance [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lazy-loading 'resources' on Instance uuid 2bdfdb89-21af-43b6-93eb-48a637bfbd4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:28:46 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe-userdata-shm.mount: Deactivated successfully.
Dec 05 09:28:46 compute-1 systemd[1]: var-lib-containers-storage-overlay-fc574073b4218a7a6721097f6d7a3d74cf01d205c90afd8a27e088a422a689cc-merged.mount: Deactivated successfully.
Dec 05 09:28:46 compute-1 podman[224899]: 2025-12-05 09:28:46.748059318 +0000 UTC m=+0.095582817 container cleanup e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.757 189070 DEBUG nova.virt.libvirt.vif [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:28:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-gen-0-1100534983',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-gen-0-1100534983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1223075532-ge',id=14,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGz0y8jwIN4iPTFbzm3Hzs8Gj5T8iGXuYAPED3QXZF12BYqK3ZPCYby6NEQkacCTzSuBzj+1xJy02xU8j3suuVwp8Fj+P/tKJdPWLIb14U774zp1EnovtYJFW3cus5PkUA==',key_name='tempest-TestSecurityGroupsBasicOps-295190475',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:28:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6c5bb818cba543bbb1bcff8df31dd9cd',ramdisk_id='',reservation_id='r-kp7icvtw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1223075532',owner_user_name='tempest-TestSecurityGroupsBasicOps-1223075532-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:28:27Z,user_data=None,user_id='bcc37d16c39547bba794fb1f43e889c1',uuid=2bdfdb89-21af-43b6-93eb-48a637bfbd4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.758 189070 DEBUG nova.network.os_vif_util [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converting VIF {"id": "6cdc578a-4ede-49b9-83a0-819716269b48", "address": "fa:16:3e:8c:88:28", "network": {"id": "846936f3-e466-44ea-ae63-40fc6aa49531", "bridge": "br-int", "label": "tempest-network-smoke--1014331195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cdc578a-4e", "ovs_interfaceid": "6cdc578a-4ede-49b9-83a0-819716269b48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.759 189070 DEBUG nova.network.os_vif_util [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:88:28,bridge_name='br-int',has_traffic_filtering=True,id=6cdc578a-4ede-49b9-83a0-819716269b48,network=Network(846936f3-e466-44ea-ae63-40fc6aa49531),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cdc578a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.759 189070 DEBUG os_vif [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:88:28,bridge_name='br-int',has_traffic_filtering=True,id=6cdc578a-4ede-49b9-83a0-819716269b48,network=Network(846936f3-e466-44ea-ae63-40fc6aa49531),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cdc578a-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.761 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.762 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6cdc578a-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.764 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.765 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:46 compute-1 systemd[1]: libpod-conmon-e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe.scope: Deactivated successfully.
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.768 189070 INFO os_vif [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:88:28,bridge_name='br-int',has_traffic_filtering=True,id=6cdc578a-4ede-49b9-83a0-819716269b48,network=Network(846936f3-e466-44ea-ae63-40fc6aa49531),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cdc578a-4e')
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.769 189070 INFO nova.virt.libvirt.driver [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Deleting instance files /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c_del
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.770 189070 INFO nova.virt.libvirt.driver [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Deletion of /var/lib/nova/instances/2bdfdb89-21af-43b6-93eb-48a637bfbd4c_del complete
Dec 05 09:28:46 compute-1 podman[224944]: 2025-12-05 09:28:46.813903237 +0000 UTC m=+0.040814298 container remove e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.820 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4003d644-709d-404e-bc9e-5333dbb05aa8]: (4, ('Fri Dec  5 09:28:46 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531 (e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe)\ne51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe\nFri Dec  5 09:28:46 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531 (e51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe)\ne51546cfa9683f2fd50024f6a572440718db3cb6c104ae280dd5be46aeab02fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.822 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[90978013-5ceb-4cbb-806f-ca6e2a567e9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.823 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap846936f3-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:28:46 compute-1 kernel: tap846936f3-e0: left promiscuous mode
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.826 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.837 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.841 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[708f03fc-aa75-4112-a390-1f367431248b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.853 189070 INFO nova.compute.manager [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Took 0.39 seconds to destroy the instance on the hypervisor.
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.854 189070 DEBUG oslo.service.loopingcall [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.854 189070 DEBUG nova.compute.manager [-] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.854 189070 DEBUG nova.network.neutron [-] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.862 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c70d0d28-72a1-4689-a3a7-4ef39f3868ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.863 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8c01b178-7201-44a6-b32b-6973712cea6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.881 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[40bd3f75-6009-4473-bcfc-8571b0b00509]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417562, 'reachable_time': 24728, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224959, 'error': None, 'target': 'ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.882 189070 DEBUG nova.compute.manager [req-68390cb4-d640-40ae-bf81-9346f8058cc5 req-11946bcf-f6e5-4ad7-ba74-26eec825f34e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received event network-vif-unplugged-6cdc578a-4ede-49b9-83a0-819716269b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.883 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-846936f3-e466-44ea-ae63-40fc6aa49531 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:28:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:28:46.883 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[7c62998a-01b2-45c5-9d97-a36fc93d67d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.883 189070 DEBUG oslo_concurrency.lockutils [req-68390cb4-d640-40ae-bf81-9346f8058cc5 req-11946bcf-f6e5-4ad7-ba74-26eec825f34e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.884 189070 DEBUG oslo_concurrency.lockutils [req-68390cb4-d640-40ae-bf81-9346f8058cc5 req-11946bcf-f6e5-4ad7-ba74-26eec825f34e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.884 189070 DEBUG oslo_concurrency.lockutils [req-68390cb4-d640-40ae-bf81-9346f8058cc5 req-11946bcf-f6e5-4ad7-ba74-26eec825f34e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.884 189070 DEBUG nova.compute.manager [req-68390cb4-d640-40ae-bf81-9346f8058cc5 req-11946bcf-f6e5-4ad7-ba74-26eec825f34e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] No waiting events found dispatching network-vif-unplugged-6cdc578a-4ede-49b9-83a0-819716269b48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:46 compute-1 nova_compute[189066]: 2025-12-05 09:28:46.884 189070 DEBUG nova.compute.manager [req-68390cb4-d640-40ae-bf81-9346f8058cc5 req-11946bcf-f6e5-4ad7-ba74-26eec825f34e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received event network-vif-unplugged-6cdc578a-4ede-49b9-83a0-819716269b48 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:28:46 compute-1 systemd[1]: run-netns-ovnmeta\x2d846936f3\x2de466\x2d44ea\x2dae63\x2d40fc6aa49531.mount: Deactivated successfully.
Dec 05 09:28:48 compute-1 podman[224960]: 2025-12-05 09:28:48.636159911 +0000 UTC m=+0.067800579 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Dec 05 09:28:48 compute-1 nova_compute[189066]: 2025-12-05 09:28:48.703 189070 DEBUG nova.network.neutron [-] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:28:48 compute-1 nova_compute[189066]: 2025-12-05 09:28:48.746 189070 INFO nova.compute.manager [-] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Took 1.89 seconds to deallocate network for instance.
Dec 05 09:28:48 compute-1 nova_compute[189066]: 2025-12-05 09:28:48.822 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:48 compute-1 nova_compute[189066]: 2025-12-05 09:28:48.822 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:48 compute-1 nova_compute[189066]: 2025-12-05 09:28:48.897 189070 DEBUG nova.compute.provider_tree [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:28:48 compute-1 nova_compute[189066]: 2025-12-05 09:28:48.920 189070 DEBUG nova.scheduler.client.report [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:28:48 compute-1 nova_compute[189066]: 2025-12-05 09:28:48.954 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.008 189070 INFO nova.scheduler.client.report [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Deleted allocations for instance 2bdfdb89-21af-43b6-93eb-48a637bfbd4c
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.046 189070 DEBUG nova.compute.manager [req-b852ba6a-8625-4ace-bc76-2b938e3ee629 req-de3b59b4-6f7c-4923-9da9-d586daeebc09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received event network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.047 189070 DEBUG oslo_concurrency.lockutils [req-b852ba6a-8625-4ace-bc76-2b938e3ee629 req-de3b59b4-6f7c-4923-9da9-d586daeebc09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.047 189070 DEBUG oslo_concurrency.lockutils [req-b852ba6a-8625-4ace-bc76-2b938e3ee629 req-de3b59b4-6f7c-4923-9da9-d586daeebc09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.047 189070 DEBUG oslo_concurrency.lockutils [req-b852ba6a-8625-4ace-bc76-2b938e3ee629 req-de3b59b4-6f7c-4923-9da9-d586daeebc09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.048 189070 DEBUG nova.compute.manager [req-b852ba6a-8625-4ace-bc76-2b938e3ee629 req-de3b59b4-6f7c-4923-9da9-d586daeebc09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] No waiting events found dispatching network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.048 189070 WARNING nova.compute.manager [req-b852ba6a-8625-4ace-bc76-2b938e3ee629 req-de3b59b4-6f7c-4923-9da9-d586daeebc09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received unexpected event network-vif-plugged-6cdc578a-4ede-49b9-83a0-819716269b48 for instance with vm_state deleted and task_state None.
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.048 189070 DEBUG nova.compute.manager [req-b852ba6a-8625-4ace-bc76-2b938e3ee629 req-de3b59b4-6f7c-4923-9da9-d586daeebc09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Received event network-vif-deleted-6cdc578a-4ede-49b9-83a0-819716269b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.141 189070 DEBUG oslo_concurrency.lockutils [None req-f42ea235-e730-44b0-9d07-dde0d9676573 bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "2bdfdb89-21af-43b6-93eb-48a637bfbd4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.207 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.681 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:49 compute-1 nova_compute[189066]: 2025-12-05 09:28:49.930 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:50 compute-1 sshd-session[224981]: Received disconnect from 185.118.15.236 port 36414:11: Bye Bye [preauth]
Dec 05 09:28:50 compute-1 sshd-session[224981]: Disconnected from authenticating user root 185.118.15.236 port 36414 [preauth]
Dec 05 09:28:50 compute-1 podman[224984]: 2025-12-05 09:28:50.621723016 +0000 UTC m=+0.059804573 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:28:51 compute-1 nova_compute[189066]: 2025-12-05 09:28:51.766 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:54 compute-1 nova_compute[189066]: 2025-12-05 09:28:54.270 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:56 compute-1 nova_compute[189066]: 2025-12-05 09:28:56.769 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:28:57 compute-1 nova_compute[189066]: 2025-12-05 09:28:57.070 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764926922.0688677, 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:28:57 compute-1 nova_compute[189066]: 2025-12-05 09:28:57.071 189070 INFO nova.compute.manager [-] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] VM Stopped (Lifecycle Event)
Dec 05 09:28:57 compute-1 nova_compute[189066]: 2025-12-05 09:28:57.098 189070 DEBUG nova.compute.manager [None req-4549f55a-f1aa-445c-893a-677e5b799773 - - - - - -] [instance: 9d2b0f76-0408-4f6b-8a37-ac9882a44b4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:28:57 compute-1 podman[225008]: 2025-12-05 09:28:57.621243733 +0000 UTC m=+0.059465044 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:28:59 compute-1 nova_compute[189066]: 2025-12-05 09:28:59.273 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:01 compute-1 nova_compute[189066]: 2025-12-05 09:29:01.730 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764926926.7288675, 2bdfdb89-21af-43b6-93eb-48a637bfbd4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:29:01 compute-1 nova_compute[189066]: 2025-12-05 09:29:01.731 189070 INFO nova.compute.manager [-] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] VM Stopped (Lifecycle Event)
Dec 05 09:29:01 compute-1 nova_compute[189066]: 2025-12-05 09:29:01.769 189070 DEBUG nova.compute.manager [None req-0204dd4c-4fd7-48c9-8590-baf8ca571808 - - - - - -] [instance: 2bdfdb89-21af-43b6-93eb-48a637bfbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:29:01 compute-1 nova_compute[189066]: 2025-12-05 09:29:01.772 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:04 compute-1 nova_compute[189066]: 2025-12-05 09:29:04.276 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:05 compute-1 podman[225032]: 2025-12-05 09:29:05.634781426 +0000 UTC m=+0.073867185 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 09:29:06 compute-1 nova_compute[189066]: 2025-12-05 09:29:06.774 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:08.873 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:08.874 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:08.874 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:09 compute-1 nova_compute[189066]: 2025-12-05 09:29:09.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:09 compute-1 nova_compute[189066]: 2025-12-05 09:29:09.278 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:09 compute-1 podman[225052]: 2025-12-05 09:29:09.661429114 +0000 UTC m=+0.100722563 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 09:29:10 compute-1 nova_compute[189066]: 2025-12-05 09:29:10.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:10 compute-1 nova_compute[189066]: 2025-12-05 09:29:10.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.745 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.749 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:29:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:29:11 compute-1 podman[225078]: 2025-12-05 09:29:11.626714444 +0000 UTC m=+0.062883118 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:29:11 compute-1 nova_compute[189066]: 2025-12-05 09:29:11.808 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:12 compute-1 nova_compute[189066]: 2025-12-05 09:29:12.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.150 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.150 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.151 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.151 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.325 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.326 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5745MB free_disk=73.33369445800781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.326 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:13 compute-1 nova_compute[189066]: 2025-12-05 09:29:13.327 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:14 compute-1 nova_compute[189066]: 2025-12-05 09:29:14.290 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:29:14 compute-1 nova_compute[189066]: 2025-12-05 09:29:14.291 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:29:14 compute-1 nova_compute[189066]: 2025-12-05 09:29:14.295 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:14 compute-1 nova_compute[189066]: 2025-12-05 09:29:14.321 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:29:14 compute-1 nova_compute[189066]: 2025-12-05 09:29:14.521 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:29:14 compute-1 podman[225098]: 2025-12-05 09:29:14.618518247 +0000 UTC m=+0.062172081 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 05 09:29:14 compute-1 nova_compute[189066]: 2025-12-05 09:29:14.914 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:29:14 compute-1 nova_compute[189066]: 2025-12-05 09:29:14.915 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:15.649 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:29:15 compute-1 nova_compute[189066]: 2025-12-05 09:29:15.650 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:15.651 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:29:15 compute-1 nova_compute[189066]: 2025-12-05 09:29:15.915 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:15 compute-1 nova_compute[189066]: 2025-12-05 09:29:15.916 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:16 compute-1 nova_compute[189066]: 2025-12-05 09:29:16.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:16 compute-1 nova_compute[189066]: 2025-12-05 09:29:16.020 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:29:16 compute-1 nova_compute[189066]: 2025-12-05 09:29:16.020 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:29:16 compute-1 nova_compute[189066]: 2025-12-05 09:29:16.111 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:29:16 compute-1 nova_compute[189066]: 2025-12-05 09:29:16.111 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:29:16 compute-1 nova_compute[189066]: 2025-12-05 09:29:16.112 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:29:16 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:16.655 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:29:16 compute-1 nova_compute[189066]: 2025-12-05 09:29:16.810 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:19 compute-1 nova_compute[189066]: 2025-12-05 09:29:19.297 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:19 compute-1 podman[225120]: 2025-12-05 09:29:19.6284503 +0000 UTC m=+0.063650058 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Dec 05 09:29:21 compute-1 podman[225143]: 2025-12-05 09:29:21.609439493 +0000 UTC m=+0.053707134 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:29:21 compute-1 nova_compute[189066]: 2025-12-05 09:29:21.813 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:24 compute-1 nova_compute[189066]: 2025-12-05 09:29:24.299 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:26 compute-1 nova_compute[189066]: 2025-12-05 09:29:26.817 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:28 compute-1 podman[225167]: 2025-12-05 09:29:28.633907391 +0000 UTC m=+0.077484865 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:29:28 compute-1 nova_compute[189066]: 2025-12-05 09:29:28.825 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "04f78c05-bd66-4054-89c0-78b77c57ecf7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:28 compute-1 nova_compute[189066]: 2025-12-05 09:29:28.826 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:28 compute-1 nova_compute[189066]: 2025-12-05 09:29:28.869 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.111 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.112 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.121 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.121 189070 INFO nova.compute.claims [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.304 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.342 189070 DEBUG nova.compute.provider_tree [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.372 189070 DEBUG nova.scheduler.client.report [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.412 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.413 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.504 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.504 189070 DEBUG nova.network.neutron [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.535 189070 INFO nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.558 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:29:29 compute-1 nova_compute[189066]: 2025-12-05 09:29:29.964 189070 DEBUG nova.policy [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65751a90715341b2984ef84ebbaa1650', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.391 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.393 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.393 189070 INFO nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Creating image(s)
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.394 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "/var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.394 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "/var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.395 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "/var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.409 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.488 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.490 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.491 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.515 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.582 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.583 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.623 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.625 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.625 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.686 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.687 189070 DEBUG nova.virt.disk.api [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Checking if we can resize image /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.687 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.752 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.753 189070 DEBUG nova.virt.disk.api [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Cannot resize image /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.754 189070 DEBUG nova.objects.instance [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'migration_context' on Instance uuid 04f78c05-bd66-4054-89c0-78b77c57ecf7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.773 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.773 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Ensure instance console log exists: /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.774 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.774 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:30 compute-1 nova_compute[189066]: 2025-12-05 09:29:30.775 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:31 compute-1 nova_compute[189066]: 2025-12-05 09:29:31.805 189070 DEBUG nova.network.neutron [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Successfully created port: 7f77bcc1-96ff-4d64-ae85-ba761c46b85e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:29:31 compute-1 nova_compute[189066]: 2025-12-05 09:29:31.861 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:34 compute-1 nova_compute[189066]: 2025-12-05 09:29:34.305 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:34 compute-1 nova_compute[189066]: 2025-12-05 09:29:34.719 189070 DEBUG nova.network.neutron [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Successfully updated port: 7f77bcc1-96ff-4d64-ae85-ba761c46b85e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:29:34 compute-1 nova_compute[189066]: 2025-12-05 09:29:34.766 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:29:34 compute-1 nova_compute[189066]: 2025-12-05 09:29:34.767 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquired lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:29:34 compute-1 nova_compute[189066]: 2025-12-05 09:29:34.767 189070 DEBUG nova.network.neutron [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:29:34 compute-1 nova_compute[189066]: 2025-12-05 09:29:34.937 189070 DEBUG nova.compute.manager [req-e1c26889-1932-4dd1-b41d-3e45283da48b req-4eea2dc2-0535-411f-a04d-5eee68badd73 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Received event network-changed-7f77bcc1-96ff-4d64-ae85-ba761c46b85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:29:34 compute-1 nova_compute[189066]: 2025-12-05 09:29:34.938 189070 DEBUG nova.compute.manager [req-e1c26889-1932-4dd1-b41d-3e45283da48b req-4eea2dc2-0535-411f-a04d-5eee68badd73 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Refreshing instance network info cache due to event network-changed-7f77bcc1-96ff-4d64-ae85-ba761c46b85e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:29:34 compute-1 nova_compute[189066]: 2025-12-05 09:29:34.938 189070 DEBUG oslo_concurrency.lockutils [req-e1c26889-1932-4dd1-b41d-3e45283da48b req-4eea2dc2-0535-411f-a04d-5eee68badd73 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:29:35 compute-1 nova_compute[189066]: 2025-12-05 09:29:35.082 189070 DEBUG nova.network.neutron [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.301 189070 DEBUG nova.network.neutron [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Updating instance_info_cache with network_info: [{"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.346 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Releasing lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.347 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Instance network_info: |[{"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.348 189070 DEBUG oslo_concurrency.lockutils [req-e1c26889-1932-4dd1-b41d-3e45283da48b req-4eea2dc2-0535-411f-a04d-5eee68badd73 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.348 189070 DEBUG nova.network.neutron [req-e1c26889-1932-4dd1-b41d-3e45283da48b req-4eea2dc2-0535-411f-a04d-5eee68badd73 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Refreshing network info cache for port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.353 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Start _get_guest_xml network_info=[{"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.361 189070 WARNING nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.368 189070 DEBUG nova.virt.libvirt.host [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.369 189070 DEBUG nova.virt.libvirt.host [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.376 189070 DEBUG nova.virt.libvirt.host [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.377 189070 DEBUG nova.virt.libvirt.host [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.379 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.380 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.380 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.381 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.381 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.381 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.381 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.382 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.382 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.382 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.382 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.383 189070 DEBUG nova.virt.hardware [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.388 189070 DEBUG nova.virt.libvirt.vif [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:29:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-9809392',display_name='tempest-TestNetworkAdvancedServerOps-server-9809392',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-9809392',id=16,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBElVLGKxlEVwzAZJ1guSW4shHmDK6WqU6dDNGGxi9zvR6hWR12i1LobTF33T7eAwK5RFh9NTFWyxVw3WpoiJEBk8hm03HrID/XQf1e/9rdkXquVTwGJStOX/buxANUYi7Q==',key_name='tempest-TestNetworkAdvancedServerOps-148347316',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-87n9oxue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:29:29Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=04f78c05-bd66-4054-89c0-78b77c57ecf7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.389 189070 DEBUG nova.network.os_vif_util [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.390 189070 DEBUG nova.network.os_vif_util [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:41:0b,bridge_name='br-int',has_traffic_filtering=True,id=7f77bcc1-96ff-4d64-ae85-ba761c46b85e,network=Network(8195ce32-c9e7-4b77-af7d-e020c8cddd3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f77bcc1-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.391 189070 DEBUG nova.objects.instance [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 04f78c05-bd66-4054-89c0-78b77c57ecf7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.411 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <uuid>04f78c05-bd66-4054-89c0-78b77c57ecf7</uuid>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <name>instance-00000010</name>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-9809392</nova:name>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:29:36</nova:creationTime>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:29:36 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:29:36 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:29:36 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:29:36 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:29:36 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:29:36 compute-1 nova_compute[189066]:         <nova:user uuid="65751a90715341b2984ef84ebbaa1650">tempest-TestNetworkAdvancedServerOps-1829130727-project-member</nova:user>
Dec 05 09:29:36 compute-1 nova_compute[189066]:         <nova:project uuid="e26ae3fdd48d4947978a480f70e14f84">tempest-TestNetworkAdvancedServerOps-1829130727</nova:project>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:29:36 compute-1 nova_compute[189066]:         <nova:port uuid="7f77bcc1-96ff-4d64-ae85-ba761c46b85e">
Dec 05 09:29:36 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <system>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <entry name="serial">04f78c05-bd66-4054-89c0-78b77c57ecf7</entry>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <entry name="uuid">04f78c05-bd66-4054-89c0-78b77c57ecf7</entry>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </system>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <os>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   </os>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <features>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   </features>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk.config"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:1e:41:0b"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <target dev="tap7f77bcc1-96"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/console.log" append="off"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <video>
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </video>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:29:36 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:29:36 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:29:36 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:29:36 compute-1 nova_compute[189066]: </domain>
Dec 05 09:29:36 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.413 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Preparing to wait for external event network-vif-plugged-7f77bcc1-96ff-4d64-ae85-ba761c46b85e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.414 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.414 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.414 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.415 189070 DEBUG nova.virt.libvirt.vif [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:29:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-9809392',display_name='tempest-TestNetworkAdvancedServerOps-server-9809392',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-9809392',id=16,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBElVLGKxlEVwzAZJ1guSW4shHmDK6WqU6dDNGGxi9zvR6hWR12i1LobTF33T7eAwK5RFh9NTFWyxVw3WpoiJEBk8hm03HrID/XQf1e/9rdkXquVTwGJStOX/buxANUYi7Q==',key_name='tempest-TestNetworkAdvancedServerOps-148347316',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-87n9oxue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:29:29Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=04f78c05-bd66-4054-89c0-78b77c57ecf7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.416 189070 DEBUG nova.network.os_vif_util [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.417 189070 DEBUG nova.network.os_vif_util [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:41:0b,bridge_name='br-int',has_traffic_filtering=True,id=7f77bcc1-96ff-4d64-ae85-ba761c46b85e,network=Network(8195ce32-c9e7-4b77-af7d-e020c8cddd3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f77bcc1-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.417 189070 DEBUG os_vif [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:41:0b,bridge_name='br-int',has_traffic_filtering=True,id=7f77bcc1-96ff-4d64-ae85-ba761c46b85e,network=Network(8195ce32-c9e7-4b77-af7d-e020c8cddd3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f77bcc1-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.418 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.418 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.419 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.425 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.425 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f77bcc1-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.426 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7f77bcc1-96, col_values=(('external_ids', {'iface-id': '7f77bcc1-96ff-4d64-ae85-ba761c46b85e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:41:0b', 'vm-uuid': '04f78c05-bd66-4054-89c0-78b77c57ecf7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.428 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:36 compute-1 NetworkManager[55704]: <info>  [1764926976.4294] manager: (tap7f77bcc1-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.431 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.440 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.441 189070 INFO os_vif [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:41:0b,bridge_name='br-int',has_traffic_filtering=True,id=7f77bcc1-96ff-4d64-ae85-ba761c46b85e,network=Network(8195ce32-c9e7-4b77-af7d-e020c8cddd3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f77bcc1-96')
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.508 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.509 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.509 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] No VIF found with MAC fa:16:3e:1e:41:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:29:36 compute-1 nova_compute[189066]: 2025-12-05 09:29:36.510 189070 INFO nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Using config drive
Dec 05 09:29:36 compute-1 podman[225208]: 2025-12-05 09:29:36.630660941 +0000 UTC m=+0.068476006 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.083 189070 INFO nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Creating config drive at /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk.config
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.088 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpboujt7qw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.218 189070 DEBUG oslo_concurrency.processutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpboujt7qw" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:29:37 compute-1 kernel: tap7f77bcc1-96: entered promiscuous mode
Dec 05 09:29:37 compute-1 ovn_controller[95809]: 2025-12-05T09:29:37Z|00119|binding|INFO|Claiming lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e for this chassis.
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.293 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:37 compute-1 ovn_controller[95809]: 2025-12-05T09:29:37Z|00120|binding|INFO|7f77bcc1-96ff-4d64-ae85-ba761c46b85e: Claiming fa:16:3e:1e:41:0b 10.100.0.4
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.297 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:37 compute-1 NetworkManager[55704]: <info>  [1764926977.2981] manager: (tap7f77bcc1-96): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.317 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:41:0b 10.100.0.4'], port_security=['fa:16:3e:1e:41:0b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '04f78c05-bd66-4054-89c0-78b77c57ecf7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '42f1426a-e52e-4db3-9f22-2fc031853713', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a949aafc-203c-48e1-86d6-9f80dc9b7491, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=7f77bcc1-96ff-4d64-ae85-ba761c46b85e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.319 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e in datapath 8195ce32-c9e7-4b77-af7d-e020c8cddd3e bound to our chassis
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.321 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8195ce32-c9e7-4b77-af7d-e020c8cddd3e
Dec 05 09:29:37 compute-1 systemd-udevd[225246]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.339 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[82c9b455-8e48-4db8-b825-4fafb0d7380c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.340 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8195ce32-c1 in ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.343 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8195ce32-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.343 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c5f703-d49f-40de-854f-4d911442a417]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.344 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c28cc836-c2da-4877-b6ef-8f896f9b89e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 systemd-machined[154815]: New machine qemu-9-instance-00000010.
Dec 05 09:29:37 compute-1 NetworkManager[55704]: <info>  [1764926977.3541] device (tap7f77bcc1-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.353 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:37 compute-1 NetworkManager[55704]: <info>  [1764926977.3554] device (tap7f77bcc1-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:29:37 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-00000010.
Dec 05 09:29:37 compute-1 ovn_controller[95809]: 2025-12-05T09:29:37Z|00121|binding|INFO|Setting lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e ovn-installed in OVS
Dec 05 09:29:37 compute-1 ovn_controller[95809]: 2025-12-05T09:29:37Z|00122|binding|INFO|Setting lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e up in Southbound
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.360 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[7dcb5b81-9adf-4d43-9f9e-809faf78ebd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.364 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.390 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ee86e06d-a712-4080-8a9d-148030ab515f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.432 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae8cb81-1a02-4059-a7f7-cff21d6c4e57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.440 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[69bd3301-c37e-4177-b2c1-9cd7567e610d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 NetworkManager[55704]: <info>  [1764926977.4414] manager: (tap8195ce32-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.483 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[f43c6f83-108c-4481-a303-ed56f944a2e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.487 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0a8a71-0ede-44f2-9732-906304aaf2d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 NetworkManager[55704]: <info>  [1764926977.5156] device (tap8195ce32-c0): carrier: link connected
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.522 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0e20b6-e1f6-4625-a39a-1417e9f96287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.545 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[03731fa1-c8b8-43ba-91fb-61273d593eba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8195ce32-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:de:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425051, 'reachable_time': 24019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225284, 'error': None, 'target': 'ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.566 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[28a19115-83d9-48c2-85d5-58b779089b3c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9a:de83'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425051, 'tstamp': 425051}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225286, 'error': None, 'target': 'ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.627 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6412a6f2-7134-40d5-8799-d09ec6428c51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8195ce32-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:de:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425051, 'reachable_time': 24019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225287, 'error': None, 'target': 'ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.677 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926977.676561, 04f78c05-bd66-4054-89c0-78b77c57ecf7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.678 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] VM Started (Lifecycle Event)
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.679 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5cef6c08-adbd-4478-a32a-cb005d3c3348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.708 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.713 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926977.6773913, 04f78c05-bd66-4054-89c0-78b77c57ecf7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.714 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] VM Paused (Lifecycle Event)
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.738 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.743 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.763 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2134c8-30e4-4bc9-ae94-80bacd41e6e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.765 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8195ce32-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.765 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.766 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8195ce32-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:29:37 compute-1 NetworkManager[55704]: <info>  [1764926977.7690] manager: (tap8195ce32-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Dec 05 09:29:37 compute-1 kernel: tap8195ce32-c0: entered promiscuous mode
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.768 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.772 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8195ce32-c0, col_values=(('external_ids', {'iface-id': '8420ea11-bf02-412c-9fba-367b868c0df0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.773 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:37 compute-1 ovn_controller[95809]: 2025-12-05T09:29:37Z|00123|binding|INFO|Releasing lport 8420ea11-bf02-412c-9fba-367b868c0df0 from this chassis (sb_readonly=0)
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.774 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.774 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8195ce32-c9e7-4b77-af7d-e020c8cddd3e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8195ce32-c9e7-4b77-af7d-e020c8cddd3e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.776 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8e645f90-2fe0-469a-b2eb-8ff45d0483da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.776 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.777 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-8195ce32-c9e7-4b77-af7d-e020c8cddd3e
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/8195ce32-c9e7-4b77-af7d-e020c8cddd3e.pid.haproxy
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 8195ce32-c9e7-4b77-af7d-e020c8cddd3e
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:29:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:29:37.779 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'env', 'PROCESS_TAG=haproxy-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8195ce32-c9e7-4b77-af7d-e020c8cddd3e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:29:37 compute-1 nova_compute[189066]: 2025-12-05 09:29:37.788 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:38 compute-1 podman[225320]: 2025-12-05 09:29:38.211219301 +0000 UTC m=+0.066199840 container create ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 09:29:38 compute-1 systemd[1]: Started libpod-conmon-ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc.scope.
Dec 05 09:29:38 compute-1 podman[225320]: 2025-12-05 09:29:38.181831438 +0000 UTC m=+0.036812017 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:29:38 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:29:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b3bdf509417a518f4a341eed45708fa459c3cb65999bf4d31d059ea62532629/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:29:38 compute-1 podman[225320]: 2025-12-05 09:29:38.305667234 +0000 UTC m=+0.160647803 container init ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 05 09:29:38 compute-1 podman[225320]: 2025-12-05 09:29:38.31154826 +0000 UTC m=+0.166528809 container start ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:29:38 compute-1 neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e[225334]: [NOTICE]   (225338) : New worker (225340) forked
Dec 05 09:29:38 compute-1 neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e[225334]: [NOTICE]   (225338) : Loading success.
Dec 05 09:29:38 compute-1 nova_compute[189066]: 2025-12-05 09:29:38.366 189070 DEBUG nova.network.neutron [req-e1c26889-1932-4dd1-b41d-3e45283da48b req-4eea2dc2-0535-411f-a04d-5eee68badd73 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Updated VIF entry in instance network info cache for port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:29:38 compute-1 nova_compute[189066]: 2025-12-05 09:29:38.367 189070 DEBUG nova.network.neutron [req-e1c26889-1932-4dd1-b41d-3e45283da48b req-4eea2dc2-0535-411f-a04d-5eee68badd73 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Updating instance_info_cache with network_info: [{"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:29:38 compute-1 nova_compute[189066]: 2025-12-05 09:29:38.387 189070 DEBUG oslo_concurrency.lockutils [req-e1c26889-1932-4dd1-b41d-3e45283da48b req-4eea2dc2-0535-411f-a04d-5eee68badd73 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:29:39 compute-1 nova_compute[189066]: 2025-12-05 09:29:39.307 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:40 compute-1 podman[225349]: 2025-12-05 09:29:40.647369632 +0000 UTC m=+0.082508611 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.327 189070 DEBUG nova.compute.manager [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Received event network-vif-plugged-7f77bcc1-96ff-4d64-ae85-ba761c46b85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.327 189070 DEBUG oslo_concurrency.lockutils [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.328 189070 DEBUG oslo_concurrency.lockutils [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.328 189070 DEBUG oslo_concurrency.lockutils [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.328 189070 DEBUG nova.compute.manager [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Processing event network-vif-plugged-7f77bcc1-96ff-4d64-ae85-ba761c46b85e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.328 189070 DEBUG nova.compute.manager [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Received event network-vif-plugged-7f77bcc1-96ff-4d64-ae85-ba761c46b85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.328 189070 DEBUG oslo_concurrency.lockutils [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.329 189070 DEBUG oslo_concurrency.lockutils [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.329 189070 DEBUG oslo_concurrency.lockutils [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.329 189070 DEBUG nova.compute.manager [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] No waiting events found dispatching network-vif-plugged-7f77bcc1-96ff-4d64-ae85-ba761c46b85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.329 189070 WARNING nova.compute.manager [req-c644f14a-c6e4-48af-bbcb-a0212b9ef12c req-700771f4-6270-4086-a676-0321a5ec8b1d 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Received unexpected event network-vif-plugged-7f77bcc1-96ff-4d64-ae85-ba761c46b85e for instance with vm_state building and task_state spawning.
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.330 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.335 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764926981.3350286, 04f78c05-bd66-4054-89c0-78b77c57ecf7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.335 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] VM Resumed (Lifecycle Event)
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.337 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.339 189070 INFO nova.virt.libvirt.driver [-] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Instance spawned successfully.
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.340 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.382 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.386 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.387 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.387 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.387 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.388 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.388 189070 DEBUG nova.virt.libvirt.driver [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.391 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.429 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.472 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.603 189070 INFO nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Took 11.21 seconds to spawn the instance on the hypervisor.
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.604 189070 DEBUG nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.734 189070 INFO nova.compute.manager [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Took 12.78 seconds to build instance.
Dec 05 09:29:41 compute-1 nova_compute[189066]: 2025-12-05 09:29:41.775 189070 DEBUG oslo_concurrency.lockutils [None req-719adf5c-043a-472c-b571-1736994731ac 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.949s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:29:42 compute-1 podman[225376]: 2025-12-05 09:29:42.626557879 +0000 UTC m=+0.061371530 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:29:44 compute-1 nova_compute[189066]: 2025-12-05 09:29:44.309 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:45 compute-1 podman[225395]: 2025-12-05 09:29:45.619880629 +0000 UTC m=+0.061707849 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 09:29:46 compute-1 nova_compute[189066]: 2025-12-05 09:29:46.431 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:49 compute-1 nova_compute[189066]: 2025-12-05 09:29:49.348 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:50 compute-1 podman[225414]: 2025-12-05 09:29:50.637093827 +0000 UTC m=+0.064309163 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git)
Dec 05 09:29:51 compute-1 nova_compute[189066]: 2025-12-05 09:29:51.435 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:52 compute-1 nova_compute[189066]: 2025-12-05 09:29:52.536 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:52 compute-1 NetworkManager[55704]: <info>  [1764926992.5437] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Dec 05 09:29:52 compute-1 NetworkManager[55704]: <info>  [1764926992.5448] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Dec 05 09:29:52 compute-1 nova_compute[189066]: 2025-12-05 09:29:52.697 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:52 compute-1 ovn_controller[95809]: 2025-12-05T09:29:52Z|00124|binding|INFO|Releasing lport 8420ea11-bf02-412c-9fba-367b868c0df0 from this chassis (sb_readonly=0)
Dec 05 09:29:52 compute-1 podman[225435]: 2025-12-05 09:29:52.716195793 +0000 UTC m=+0.116427586 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:29:52 compute-1 nova_compute[189066]: 2025-12-05 09:29:52.725 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:54 compute-1 nova_compute[189066]: 2025-12-05 09:29:54.042 189070 DEBUG nova.compute.manager [req-e7b18d26-2ff7-457f-bed9-12db4d2d949e req-a4eab61d-a429-4dc8-b8a4-c557040cd3e3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Received event network-changed-7f77bcc1-96ff-4d64-ae85-ba761c46b85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:29:54 compute-1 nova_compute[189066]: 2025-12-05 09:29:54.042 189070 DEBUG nova.compute.manager [req-e7b18d26-2ff7-457f-bed9-12db4d2d949e req-a4eab61d-a429-4dc8-b8a4-c557040cd3e3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Refreshing instance network info cache due to event network-changed-7f77bcc1-96ff-4d64-ae85-ba761c46b85e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:29:54 compute-1 nova_compute[189066]: 2025-12-05 09:29:54.042 189070 DEBUG oslo_concurrency.lockutils [req-e7b18d26-2ff7-457f-bed9-12db4d2d949e req-a4eab61d-a429-4dc8-b8a4-c557040cd3e3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:29:54 compute-1 nova_compute[189066]: 2025-12-05 09:29:54.043 189070 DEBUG oslo_concurrency.lockutils [req-e7b18d26-2ff7-457f-bed9-12db4d2d949e req-a4eab61d-a429-4dc8-b8a4-c557040cd3e3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:29:54 compute-1 nova_compute[189066]: 2025-12-05 09:29:54.043 189070 DEBUG nova.network.neutron [req-e7b18d26-2ff7-457f-bed9-12db4d2d949e req-a4eab61d-a429-4dc8-b8a4-c557040cd3e3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Refreshing network info cache for port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:29:54 compute-1 nova_compute[189066]: 2025-12-05 09:29:54.381 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:56 compute-1 ovn_controller[95809]: 2025-12-05T09:29:56Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:41:0b 10.100.0.4
Dec 05 09:29:56 compute-1 ovn_controller[95809]: 2025-12-05T09:29:56Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:41:0b 10.100.0.4
Dec 05 09:29:56 compute-1 nova_compute[189066]: 2025-12-05 09:29:56.438 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:59 compute-1 nova_compute[189066]: 2025-12-05 09:29:59.386 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:29:59 compute-1 nova_compute[189066]: 2025-12-05 09:29:59.493 189070 DEBUG nova.network.neutron [req-e7b18d26-2ff7-457f-bed9-12db4d2d949e req-a4eab61d-a429-4dc8-b8a4-c557040cd3e3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Updated VIF entry in instance network info cache for port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:29:59 compute-1 nova_compute[189066]: 2025-12-05 09:29:59.494 189070 DEBUG nova.network.neutron [req-e7b18d26-2ff7-457f-bed9-12db4d2d949e req-a4eab61d-a429-4dc8-b8a4-c557040cd3e3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Updating instance_info_cache with network_info: [{"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:29:59 compute-1 nova_compute[189066]: 2025-12-05 09:29:59.527 189070 DEBUG oslo_concurrency.lockutils [req-e7b18d26-2ff7-457f-bed9-12db4d2d949e req-a4eab61d-a429-4dc8-b8a4-c557040cd3e3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:29:59 compute-1 podman[225473]: 2025-12-05 09:29:59.639932601 +0000 UTC m=+0.074003132 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:30:01 compute-1 nova_compute[189066]: 2025-12-05 09:30:01.440 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:02 compute-1 nova_compute[189066]: 2025-12-05 09:30:02.461 189070 INFO nova.compute.manager [None req-98ab8e82-12e0-4128-84bf-6c255dd67489 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Get console output
Dec 05 09:30:02 compute-1 nova_compute[189066]: 2025-12-05 09:30:02.470 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:30:02 compute-1 nova_compute[189066]: 2025-12-05 09:30:02.980 189070 INFO nova.compute.manager [None req-fda6d7ab-1ee2-4b1a-977e-04e293548617 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Pausing
Dec 05 09:30:02 compute-1 nova_compute[189066]: 2025-12-05 09:30:02.981 189070 DEBUG nova.objects.instance [None req-fda6d7ab-1ee2-4b1a-977e-04e293548617 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'flavor' on Instance uuid 04f78c05-bd66-4054-89c0-78b77c57ecf7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:30:03 compute-1 nova_compute[189066]: 2025-12-05 09:30:03.029 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927003.0294764, 04f78c05-bd66-4054-89c0-78b77c57ecf7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:30:03 compute-1 nova_compute[189066]: 2025-12-05 09:30:03.030 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] VM Paused (Lifecycle Event)
Dec 05 09:30:03 compute-1 nova_compute[189066]: 2025-12-05 09:30:03.032 189070 DEBUG nova.compute.manager [None req-fda6d7ab-1ee2-4b1a-977e-04e293548617 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:03 compute-1 nova_compute[189066]: 2025-12-05 09:30:03.082 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:03 compute-1 nova_compute[189066]: 2025-12-05 09:30:03.087 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:30:04 compute-1 nova_compute[189066]: 2025-12-05 09:30:04.388 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:06 compute-1 nova_compute[189066]: 2025-12-05 09:30:06.443 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:07 compute-1 nova_compute[189066]: 2025-12-05 09:30:07.106 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:07 compute-1 podman[225499]: 2025-12-05 09:30:07.664689241 +0000 UTC m=+0.096912916 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 09:30:08 compute-1 nova_compute[189066]: 2025-12-05 09:30:08.671 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:08 compute-1 nova_compute[189066]: 2025-12-05 09:30:08.704 189070 INFO nova.compute.manager [None req-ef74470f-2565-4458-bc84-12d85d0bb803 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Get console output
Dec 05 09:30:08 compute-1 nova_compute[189066]: 2025-12-05 09:30:08.711 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:30:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:08.874 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:08.876 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:08.877 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:08 compute-1 nova_compute[189066]: 2025-12-05 09:30:08.990 189070 INFO nova.compute.manager [None req-9e4ee91a-80a0-4b8a-b744-3ef949731377 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Unpausing
Dec 05 09:30:08 compute-1 nova_compute[189066]: 2025-12-05 09:30:08.991 189070 DEBUG nova.objects.instance [None req-9e4ee91a-80a0-4b8a-b744-3ef949731377 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'flavor' on Instance uuid 04f78c05-bd66-4054-89c0-78b77c57ecf7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.057 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927009.056863, 04f78c05-bd66-4054-89c0-78b77c57ecf7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.058 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] VM Resumed (Lifecycle Event)
Dec 05 09:30:09 compute-1 virtqemud[188731]: argument unsupported: QEMU guest agent is not configured
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.063 189070 DEBUG nova.virt.libvirt.guest [None req-9e4ee91a-80a0-4b8a-b744-3ef949731377 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.063 189070 DEBUG nova.compute.manager [None req-9e4ee91a-80a0-4b8a-b744-3ef949731377 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.106 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.110 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.152 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] During sync_power_state the instance has a pending task (unpausing). Skip.
Dec 05 09:30:09 compute-1 nova_compute[189066]: 2025-12-05 09:30:09.390 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:10 compute-1 nova_compute[189066]: 2025-12-05 09:30:10.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:11 compute-1 nova_compute[189066]: 2025-12-05 09:30:11.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:11 compute-1 nova_compute[189066]: 2025-12-05 09:30:11.445 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:11 compute-1 podman[225519]: 2025-12-05 09:30:11.656774134 +0000 UTC m=+0.098126345 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 09:30:12 compute-1 nova_compute[189066]: 2025-12-05 09:30:12.359 189070 INFO nova.compute.manager [None req-79fd3369-e2d1-4431-ac57-18c7ae150196 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Get console output
Dec 05 09:30:12 compute-1 nova_compute[189066]: 2025-12-05 09:30:12.365 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:30:13 compute-1 nova_compute[189066]: 2025-12-05 09:30:13.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:13 compute-1 nova_compute[189066]: 2025-12-05 09:30:13.519 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:13 compute-1 podman[225546]: 2025-12-05 09:30:13.637818097 +0000 UTC m=+0.064568210 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:30:13 compute-1 nova_compute[189066]: 2025-12-05 09:30:13.945 189070 DEBUG nova.compute.manager [req-a8c032a8-9649-43cc-861c-67c13d229fbb req-ea245581-2af0-4760-a30b-6aecc2f6feb7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Received event network-changed-7f77bcc1-96ff-4d64-ae85-ba761c46b85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:30:13 compute-1 nova_compute[189066]: 2025-12-05 09:30:13.946 189070 DEBUG nova.compute.manager [req-a8c032a8-9649-43cc-861c-67c13d229fbb req-ea245581-2af0-4760-a30b-6aecc2f6feb7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Refreshing instance network info cache due to event network-changed-7f77bcc1-96ff-4d64-ae85-ba761c46b85e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:30:13 compute-1 nova_compute[189066]: 2025-12-05 09:30:13.946 189070 DEBUG oslo_concurrency.lockutils [req-a8c032a8-9649-43cc-861c-67c13d229fbb req-ea245581-2af0-4760-a30b-6aecc2f6feb7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:30:13 compute-1 nova_compute[189066]: 2025-12-05 09:30:13.946 189070 DEBUG oslo_concurrency.lockutils [req-a8c032a8-9649-43cc-861c-67c13d229fbb req-ea245581-2af0-4760-a30b-6aecc2f6feb7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:30:13 compute-1 nova_compute[189066]: 2025-12-05 09:30:13.946 189070 DEBUG nova.network.neutron [req-a8c032a8-9649-43cc-861c-67c13d229fbb req-ea245581-2af0-4760-a30b-6aecc2f6feb7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Refreshing network info cache for port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.067 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.068 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.068 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.069 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.160 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "04f78c05-bd66-4054-89c0-78b77c57ecf7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.161 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.161 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.161 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.162 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.163 189070 INFO nova.compute.manager [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Terminating instance
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.164 189070 DEBUG nova.compute.manager [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:30:14 compute-1 kernel: tap7f77bcc1-96 (unregistering): left promiscuous mode
Dec 05 09:30:14 compute-1 NetworkManager[55704]: <info>  [1764927014.1944] device (tap7f77bcc1-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.207 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 ovn_controller[95809]: 2025-12-05T09:30:14Z|00125|binding|INFO|Releasing lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e from this chassis (sb_readonly=0)
Dec 05 09:30:14 compute-1 ovn_controller[95809]: 2025-12-05T09:30:14Z|00126|binding|INFO|Setting lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e down in Southbound
Dec 05 09:30:14 compute-1 ovn_controller[95809]: 2025-12-05T09:30:14Z|00127|binding|INFO|Removing iface tap7f77bcc1-96 ovn-installed in OVS
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.211 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.220 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:41:0b 10.100.0.4'], port_security=['fa:16:3e:1e:41:0b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '04f78c05-bd66-4054-89c0-78b77c57ecf7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '42f1426a-e52e-4db3-9f22-2fc031853713', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a949aafc-203c-48e1-86d6-9f80dc9b7491, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=7f77bcc1-96ff-4d64-ae85-ba761c46b85e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.222 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e in datapath 8195ce32-c9e7-4b77-af7d-e020c8cddd3e unbound from our chassis
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.224 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8195ce32-c9e7-4b77-af7d-e020c8cddd3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.226 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[01946f8f-ec36-4c0d-901e-55934f90383e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.227 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e namespace which is not needed anymore
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.229 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000010.scope: Deactivated successfully.
Dec 05 09:30:14 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000010.scope: Consumed 14.494s CPU time.
Dec 05 09:30:14 compute-1 systemd-machined[154815]: Machine qemu-9-instance-00000010 terminated.
Dec 05 09:30:14 compute-1 neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e[225334]: [NOTICE]   (225338) : haproxy version is 2.8.14-c23fe91
Dec 05 09:30:14 compute-1 neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e[225334]: [NOTICE]   (225338) : path to executable is /usr/sbin/haproxy
Dec 05 09:30:14 compute-1 neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e[225334]: [WARNING]  (225338) : Exiting Master process...
Dec 05 09:30:14 compute-1 neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e[225334]: [ALERT]    (225338) : Current worker (225340) exited with code 143 (Terminated)
Dec 05 09:30:14 compute-1 neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e[225334]: [WARNING]  (225338) : All workers exited. Exiting... (0)
Dec 05 09:30:14 compute-1 systemd[1]: libpod-ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc.scope: Deactivated successfully.
Dec 05 09:30:14 compute-1 podman[225592]: 2025-12-05 09:30:14.387061991 +0000 UTC m=+0.053911227 container died ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:30:14 compute-1 kernel: tap7f77bcc1-96: entered promiscuous mode
Dec 05 09:30:14 compute-1 kernel: tap7f77bcc1-96 (unregistering): left promiscuous mode
Dec 05 09:30:14 compute-1 NetworkManager[55704]: <info>  [1764927014.3906] manager: (tap7f77bcc1-96): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Dec 05 09:30:14 compute-1 systemd-udevd[225570]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:30:14 compute-1 ovn_controller[95809]: 2025-12-05T09:30:14Z|00128|binding|INFO|Claiming lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e for this chassis.
Dec 05 09:30:14 compute-1 ovn_controller[95809]: 2025-12-05T09:30:14Z|00129|binding|INFO|7f77bcc1-96ff-4d64-ae85-ba761c46b85e: Claiming fa:16:3e:1e:41:0b 10.100.0.4
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.394 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 ovn_controller[95809]: 2025-12-05T09:30:14Z|00130|binding|INFO|Setting lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e ovn-installed in OVS
Dec 05 09:30:14 compute-1 ovn_controller[95809]: 2025-12-05T09:30:14Z|00131|if_status|INFO|Not setting lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e down as sb is readonly
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.415 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc-userdata-shm.mount: Deactivated successfully.
Dec 05 09:30:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-6b3bdf509417a518f4a341eed45708fa459c3cb65999bf4d31d059ea62532629-merged.mount: Deactivated successfully.
Dec 05 09:30:14 compute-1 ovn_controller[95809]: 2025-12-05T09:30:14Z|00132|binding|INFO|Releasing lport 7f77bcc1-96ff-4d64-ae85-ba761c46b85e from this chassis (sb_readonly=0)
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.437 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:41:0b 10.100.0.4'], port_security=['fa:16:3e:1e:41:0b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '04f78c05-bd66-4054-89c0-78b77c57ecf7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '42f1426a-e52e-4db3-9f22-2fc031853713', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a949aafc-203c-48e1-86d6-9f80dc9b7491, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=7f77bcc1-96ff-4d64-ae85-ba761c46b85e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:30:14 compute-1 podman[225592]: 2025-12-05 09:30:14.438684302 +0000 UTC m=+0.105533538 container cleanup ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:30:14 compute-1 systemd[1]: libpod-conmon-ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc.scope: Deactivated successfully.
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.451 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:41:0b 10.100.0.4'], port_security=['fa:16:3e:1e:41:0b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '04f78c05-bd66-4054-89c0-78b77c57ecf7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e26ae3fdd48d4947978a480f70e14f84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '42f1426a-e52e-4db3-9f22-2fc031853713', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a949aafc-203c-48e1-86d6-9f80dc9b7491, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=7f77bcc1-96ff-4d64-ae85-ba761c46b85e) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.452 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.455 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.475 189070 INFO nova.virt.libvirt.driver [-] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Instance destroyed successfully.
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.476 189070 DEBUG nova.objects.instance [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lazy-loading 'resources' on Instance uuid 04f78c05-bd66-4054-89c0-78b77c57ecf7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.502 189070 DEBUG nova.virt.libvirt.vif [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:29:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-9809392',display_name='tempest-TestNetworkAdvancedServerOps-server-9809392',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-9809392',id=16,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBElVLGKxlEVwzAZJ1guSW4shHmDK6WqU6dDNGGxi9zvR6hWR12i1LobTF33T7eAwK5RFh9NTFWyxVw3WpoiJEBk8hm03HrID/XQf1e/9rdkXquVTwGJStOX/buxANUYi7Q==',key_name='tempest-TestNetworkAdvancedServerOps-148347316',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:29:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e26ae3fdd48d4947978a480f70e14f84',ramdisk_id='',reservation_id='r-87n9oxue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1829130727',owner_user_name='tempest-TestNetworkAdvancedServerOps-1829130727-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:30:09Z,user_data=None,user_id='65751a90715341b2984ef84ebbaa1650',uuid=04f78c05-bd66-4054-89c0-78b77c57ecf7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.503 189070 DEBUG nova.network.os_vif_util [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converting VIF {"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.504 189070 DEBUG nova.network.os_vif_util [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:41:0b,bridge_name='br-int',has_traffic_filtering=True,id=7f77bcc1-96ff-4d64-ae85-ba761c46b85e,network=Network(8195ce32-c9e7-4b77-af7d-e020c8cddd3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f77bcc1-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.504 189070 DEBUG os_vif [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:41:0b,bridge_name='br-int',has_traffic_filtering=True,id=7f77bcc1-96ff-4d64-ae85-ba761c46b85e,network=Network(8195ce32-c9e7-4b77-af7d-e020c8cddd3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f77bcc1-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.508 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.508 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f77bcc1-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:14 compute-1 podman[225631]: 2025-12-05 09:30:14.509853053 +0000 UTC m=+0.044544147 container remove ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.518 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.519 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.548 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.552 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.551 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[79b81e1e-8f75-45fe-8f39-786262f07658]: (4, ('Fri Dec  5 09:30:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e (ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc)\nef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc\nFri Dec  5 09:30:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e (ef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc)\nef1319c271b8981336f93163f54e7878665b76863cc6e195c544f8a8a9129ccc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.554 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[02d69fe1-7d06-4d6f-8ef9-547259efd3f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.556 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8195ce32-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.556 189070 INFO os_vif [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:41:0b,bridge_name='br-int',has_traffic_filtering=True,id=7f77bcc1-96ff-4d64-ae85-ba761c46b85e,network=Network(8195ce32-c9e7-4b77-af7d-e020c8cddd3e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f77bcc1-96')
Dec 05 09:30:14 compute-1 kernel: tap8195ce32-c0: left promiscuous mode
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.557 189070 INFO nova.virt.libvirt.driver [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Deleting instance files /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7_del
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.560 189070 INFO nova.virt.libvirt.driver [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Deletion of /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7_del complete
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.567 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.569 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.573 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[575e2757-3898-4879-9c00-12d41cef2c1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.584 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk --force-share --output=json" returned: 1 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.586 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] '/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk --force-share --output=json' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.588 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000010, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/04f78c05-bd66-4054-89c0-78b77c57ecf7/disk
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.589 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[18dfe898-a43e-472b-a4a6-8ca805be9089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.590 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[de436bcb-0fa5-4ae8-aa8f-b2d221eadc4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.615 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[54ae86b9-5a38-4e11-9945-9fbff1b3a966]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425042, 'reachable_time': 43901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225652, 'error': None, 'target': 'ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 systemd[1]: run-netns-ovnmeta\x2d8195ce32\x2dc9e7\x2d4b77\x2daf7d\x2de020c8cddd3e.mount: Deactivated successfully.
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.620 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8195ce32-c9e7-4b77-af7d-e020c8cddd3e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.620 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[81e58788-9637-4eff-b5a5-926adef931ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.622 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e in datapath 8195ce32-c9e7-4b77-af7d-e020c8cddd3e unbound from our chassis
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.625 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8195ce32-c9e7-4b77-af7d-e020c8cddd3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.626 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b196e282-10b1-4328-9269-354eab651c6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.627 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e in datapath 8195ce32-c9e7-4b77-af7d-e020c8cddd3e unbound from our chassis
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.629 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8195ce32-c9e7-4b77-af7d-e020c8cddd3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:30:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:14.630 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a009ab-8879-4f21-b349-9fe2222a2dd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.643 189070 INFO nova.compute.manager [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Took 0.48 seconds to destroy the instance on the hypervisor.
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.644 189070 DEBUG oslo.service.loopingcall [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.644 189070 DEBUG nova.compute.manager [-] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.644 189070 DEBUG nova.network.neutron [-] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.767 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.768 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5600MB free_disk=73.30430603027344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.769 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:14 compute-1 nova_compute[189066]: 2025-12-05 09:30:14.769 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:15 compute-1 nova_compute[189066]: 2025-12-05 09:30:15.196 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 04f78c05-bd66-4054-89c0-78b77c57ecf7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:30:15 compute-1 nova_compute[189066]: 2025-12-05 09:30:15.196 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:30:15 compute-1 nova_compute[189066]: 2025-12-05 09:30:15.196 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:30:15 compute-1 nova_compute[189066]: 2025-12-05 09:30:15.263 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:30:15 compute-1 nova_compute[189066]: 2025-12-05 09:30:15.284 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:30:15 compute-1 nova_compute[189066]: 2025-12-05 09:30:15.321 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:30:15 compute-1 nova_compute[189066]: 2025-12-05 09:30:15.322 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:16 compute-1 nova_compute[189066]: 2025-12-05 09:30:16.323 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:16 compute-1 nova_compute[189066]: 2025-12-05 09:30:16.324 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:16 compute-1 nova_compute[189066]: 2025-12-05 09:30:16.324 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:16 compute-1 nova_compute[189066]: 2025-12-05 09:30:16.325 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:30:16 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:16.394 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:30:16 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:16.395 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:30:16 compute-1 nova_compute[189066]: 2025-12-05 09:30:16.395 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:16 compute-1 podman[225653]: 2025-12-05 09:30:16.62960816 +0000 UTC m=+0.064938769 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:30:17 compute-1 nova_compute[189066]: 2025-12-05 09:30:17.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:30:17 compute-1 nova_compute[189066]: 2025-12-05 09:30:17.023 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:30:17 compute-1 nova_compute[189066]: 2025-12-05 09:30:17.023 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:30:17 compute-1 nova_compute[189066]: 2025-12-05 09:30:17.051 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Dec 05 09:30:17 compute-1 nova_compute[189066]: 2025-12-05 09:30:17.051 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.529 189070 DEBUG nova.network.neutron [-] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.566 189070 INFO nova.compute.manager [-] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Took 3.92 seconds to deallocate network for instance.
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.622 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.623 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.627 189070 DEBUG nova.compute.manager [req-c843c475-07cb-4557-9def-540160f6fc0e req-fc83b2ea-5e4d-4283-a0ed-ef4c526390e1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Received event network-vif-deleted-7f77bcc1-96ff-4d64-ae85-ba761c46b85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.693 189070 DEBUG nova.compute.provider_tree [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.721 189070 DEBUG nova.scheduler.client.report [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.751 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.776 189070 INFO nova.scheduler.client.report [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Deleted allocations for instance 04f78c05-bd66-4054-89c0-78b77c57ecf7
Dec 05 09:30:18 compute-1 nova_compute[189066]: 2025-12-05 09:30:18.975 189070 DEBUG oslo_concurrency.lockutils [None req-192edfe4-a6b3-4503-ae38-f1b25286e581 65751a90715341b2984ef84ebbaa1650 e26ae3fdd48d4947978a480f70e14f84 - - default default] Lock "04f78c05-bd66-4054-89c0-78b77c57ecf7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:19.397 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:19 compute-1 nova_compute[189066]: 2025-12-05 09:30:19.442 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:19 compute-1 nova_compute[189066]: 2025-12-05 09:30:19.547 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:19 compute-1 nova_compute[189066]: 2025-12-05 09:30:19.908 189070 DEBUG nova.network.neutron [req-a8c032a8-9649-43cc-861c-67c13d229fbb req-ea245581-2af0-4760-a30b-6aecc2f6feb7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Updated VIF entry in instance network info cache for port 7f77bcc1-96ff-4d64-ae85-ba761c46b85e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:30:19 compute-1 nova_compute[189066]: 2025-12-05 09:30:19.909 189070 DEBUG nova.network.neutron [req-a8c032a8-9649-43cc-861c-67c13d229fbb req-ea245581-2af0-4760-a30b-6aecc2f6feb7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Updating instance_info_cache with network_info: [{"id": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "address": "fa:16:3e:1e:41:0b", "network": {"id": "8195ce32-c9e7-4b77-af7d-e020c8cddd3e", "bridge": "br-int", "label": "tempest-network-smoke--234633567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e26ae3fdd48d4947978a480f70e14f84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f77bcc1-96", "ovs_interfaceid": "7f77bcc1-96ff-4d64-ae85-ba761c46b85e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:30:19 compute-1 nova_compute[189066]: 2025-12-05 09:30:19.966 189070 DEBUG oslo_concurrency.lockutils [req-a8c032a8-9649-43cc-861c-67c13d229fbb req-ea245581-2af0-4760-a30b-6aecc2f6feb7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-04f78c05-bd66-4054-89c0-78b77c57ecf7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:30:21 compute-1 podman[225676]: 2025-12-05 09:30:21.641824885 +0000 UTC m=+0.083401774 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.6, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Dec 05 09:30:23 compute-1 podman[225697]: 2025-12-05 09:30:23.615942278 +0000 UTC m=+0.059583698 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:30:24 compute-1 nova_compute[189066]: 2025-12-05 09:30:24.501 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:24 compute-1 nova_compute[189066]: 2025-12-05 09:30:24.548 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:24 compute-1 nova_compute[189066]: 2025-12-05 09:30:24.670 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:24 compute-1 nova_compute[189066]: 2025-12-05 09:30:24.971 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "d615ca65-b10a-4ff3-a274-7b300d2e1808" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:24 compute-1 nova_compute[189066]: 2025-12-05 09:30:24.971 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.007 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.125 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.126 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.135 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.136 189070 INFO nova.compute.claims [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.309 189070 DEBUG nova.compute.provider_tree [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.363 189070 DEBUG nova.scheduler.client.report [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.393 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.393 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.451 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.452 189070 DEBUG nova.network.neutron [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.472 189070 INFO nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.494 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.597 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.598 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.599 189070 INFO nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Creating image(s)
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.599 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "/var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.600 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.600 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.613 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.673 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.674 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.674 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.687 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.746 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.748 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.787 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.788 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.788 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.848 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.849 189070 DEBUG nova.virt.disk.api [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Checking if we can resize image /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.849 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.910 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.911 189070 DEBUG nova.virt.disk.api [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Cannot resize image /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.911 189070 DEBUG nova.objects.instance [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'migration_context' on Instance uuid d615ca65-b10a-4ff3-a274-7b300d2e1808 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.935 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.936 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Ensure instance console log exists: /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.936 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.937 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:25 compute-1 nova_compute[189066]: 2025-12-05 09:30:25.937 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:26 compute-1 nova_compute[189066]: 2025-12-05 09:30:26.125 189070 DEBUG nova.policy [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:30:28 compute-1 nova_compute[189066]: 2025-12-05 09:30:28.620 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:28 compute-1 nova_compute[189066]: 2025-12-05 09:30:28.641 189070 DEBUG nova.network.neutron [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Successfully created port: 8ecf8178-afa3-4403-9157-3d31a281b7aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:30:28 compute-1 sshd-session[225721]: Received disconnect from 101.47.162.91 port 47064:11: Bye Bye [preauth]
Dec 05 09:30:28 compute-1 sshd-session[225721]: Disconnected from authenticating user root 101.47.162.91 port 47064 [preauth]
Dec 05 09:30:28 compute-1 nova_compute[189066]: 2025-12-05 09:30:28.912 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:29 compute-1 nova_compute[189066]: 2025-12-05 09:30:29.451 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927014.4493384, 04f78c05-bd66-4054-89c0-78b77c57ecf7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:30:29 compute-1 nova_compute[189066]: 2025-12-05 09:30:29.452 189070 INFO nova.compute.manager [-] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] VM Stopped (Lifecycle Event)
Dec 05 09:30:29 compute-1 nova_compute[189066]: 2025-12-05 09:30:29.503 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:29 compute-1 nova_compute[189066]: 2025-12-05 09:30:29.509 189070 DEBUG nova.compute.manager [None req-47e3da13-1757-48b3-a7e4-b53fb2211452 - - - - - -] [instance: 04f78c05-bd66-4054-89c0-78b77c57ecf7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:29 compute-1 nova_compute[189066]: 2025-12-05 09:30:29.551 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:30 compute-1 podman[225739]: 2025-12-05 09:30:30.894467474 +0000 UTC m=+0.327615622 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:30:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:32.064 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:90:c2 2001:db8:0:1:f816:3eff:fe78:90c2 2001:db8::f816:3eff:fe78:90c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe78:90c2/64 2001:db8::f816:3eff:fe78:90c2/64', 'neutron:device_id': 'ovnmeta-98fe47c1-61c4-467d-aac4-24f57be8bf8d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98fe47c1-61c4-467d-aac4-24f57be8bf8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5367df7-d393-487e-8618-33c8723ccb61, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=93ebf118-8bf0-4da9-a0d0-bc09103a1f8f) old=Port_Binding(mac=['fa:16:3e:78:90:c2 2001:db8::f816:3eff:fe78:90c2'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe78:90c2/64', 'neutron:device_id': 'ovnmeta-98fe47c1-61c4-467d-aac4-24f57be8bf8d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98fe47c1-61c4-467d-aac4-24f57be8bf8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:30:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:32.065 105272 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 93ebf118-8bf0-4da9-a0d0-bc09103a1f8f in datapath 98fe47c1-61c4-467d-aac4-24f57be8bf8d updated
Dec 05 09:30:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:32.067 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 98fe47c1-61c4-467d-aac4-24f57be8bf8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:30:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:32.068 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f53fabfa-3e26-44ca-b21c-1ee17078a47e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:32 compute-1 nova_compute[189066]: 2025-12-05 09:30:32.990 189070 DEBUG nova.network.neutron [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Successfully updated port: 8ecf8178-afa3-4403-9157-3d31a281b7aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:30:33 compute-1 nova_compute[189066]: 2025-12-05 09:30:33.024 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-d615ca65-b10a-4ff3-a274-7b300d2e1808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:30:33 compute-1 nova_compute[189066]: 2025-12-05 09:30:33.025 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-d615ca65-b10a-4ff3-a274-7b300d2e1808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:30:33 compute-1 nova_compute[189066]: 2025-12-05 09:30:33.025 189070 DEBUG nova.network.neutron [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:30:33 compute-1 nova_compute[189066]: 2025-12-05 09:30:33.216 189070 DEBUG nova.compute.manager [req-05f131e9-07c1-4945-8b55-1e5d989948b0 req-f2ea9a03-c76a-4d43-8819-a7683b2d5290 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Received event network-changed-8ecf8178-afa3-4403-9157-3d31a281b7aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:30:33 compute-1 nova_compute[189066]: 2025-12-05 09:30:33.217 189070 DEBUG nova.compute.manager [req-05f131e9-07c1-4945-8b55-1e5d989948b0 req-f2ea9a03-c76a-4d43-8819-a7683b2d5290 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Refreshing instance network info cache due to event network-changed-8ecf8178-afa3-4403-9157-3d31a281b7aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:30:33 compute-1 nova_compute[189066]: 2025-12-05 09:30:33.217 189070 DEBUG oslo_concurrency.lockutils [req-05f131e9-07c1-4945-8b55-1e5d989948b0 req-f2ea9a03-c76a-4d43-8819-a7683b2d5290 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-d615ca65-b10a-4ff3-a274-7b300d2e1808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:30:33 compute-1 nova_compute[189066]: 2025-12-05 09:30:33.399 189070 DEBUG nova.network.neutron [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:30:34 compute-1 nova_compute[189066]: 2025-12-05 09:30:34.505 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:34 compute-1 nova_compute[189066]: 2025-12-05 09:30:34.553 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.206 189070 DEBUG nova.network.neutron [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Updating instance_info_cache with network_info: [{"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.260 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-d615ca65-b10a-4ff3-a274-7b300d2e1808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.261 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Instance network_info: |[{"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.262 189070 DEBUG oslo_concurrency.lockutils [req-05f131e9-07c1-4945-8b55-1e5d989948b0 req-f2ea9a03-c76a-4d43-8819-a7683b2d5290 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-d615ca65-b10a-4ff3-a274-7b300d2e1808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.263 189070 DEBUG nova.network.neutron [req-05f131e9-07c1-4945-8b55-1e5d989948b0 req-f2ea9a03-c76a-4d43-8819-a7683b2d5290 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Refreshing network info cache for port 8ecf8178-afa3-4403-9157-3d31a281b7aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.270 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Start _get_guest_xml network_info=[{"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.279 189070 WARNING nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.291 189070 DEBUG nova.virt.libvirt.host [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.292 189070 DEBUG nova.virt.libvirt.host [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.302 189070 DEBUG nova.virt.libvirt.host [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.303 189070 DEBUG nova.virt.libvirt.host [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.304 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.304 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.305 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.305 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.305 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.306 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.306 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.306 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.306 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.307 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.307 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.307 189070 DEBUG nova.virt.hardware [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.312 189070 DEBUG nova.virt.libvirt.vif [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:30:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1530331023',display_name='tempest-TestNetworkBasicOps-server-1530331023',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1530331023',id=19,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNREUxZEXDGlM1cS2M56hWAaHBDxXPzppWgxjbb+JJgdE8aiFxjvmXx2SUTfh+sxoQ78m1bgaZQKtd++zJCNAQJVePjcvSgQzSxjDBnGLKPV8IYUkASQ+pSRHD7qN2MCBA==',key_name='tempest-TestNetworkBasicOps-357394776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-gvmh0cdu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:30:25Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=d615ca65-b10a-4ff3-a274-7b300d2e1808,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.313 189070 DEBUG nova.network.os_vif_util [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.313 189070 DEBUG nova.network.os_vif_util [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:ae:d2,bridge_name='br-int',has_traffic_filtering=True,id=8ecf8178-afa3-4403-9157-3d31a281b7aa,network=Network(4b2a905b-11a3-4881-aea2-258a30f07ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ecf8178-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.315 189070 DEBUG nova.objects.instance [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid d615ca65-b10a-4ff3-a274-7b300d2e1808 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.338 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <uuid>d615ca65-b10a-4ff3-a274-7b300d2e1808</uuid>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <name>instance-00000013</name>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkBasicOps-server-1530331023</nova:name>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:30:36</nova:creationTime>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:30:36 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:30:36 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:30:36 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:30:36 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:30:36 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:30:36 compute-1 nova_compute[189066]:         <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:30:36 compute-1 nova_compute[189066]:         <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:30:36 compute-1 nova_compute[189066]:         <nova:port uuid="8ecf8178-afa3-4403-9157-3d31a281b7aa">
Dec 05 09:30:36 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <system>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <entry name="serial">d615ca65-b10a-4ff3-a274-7b300d2e1808</entry>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <entry name="uuid">d615ca65-b10a-4ff3-a274-7b300d2e1808</entry>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </system>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <os>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   </os>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <features>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   </features>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk.config"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:d0:ae:d2"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <target dev="tap8ecf8178-af"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/console.log" append="off"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <video>
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </video>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:30:36 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:30:36 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:30:36 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:30:36 compute-1 nova_compute[189066]: </domain>
Dec 05 09:30:36 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.339 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Preparing to wait for external event network-vif-plugged-8ecf8178-afa3-4403-9157-3d31a281b7aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.340 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.340 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.340 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.341 189070 DEBUG nova.virt.libvirt.vif [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:30:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1530331023',display_name='tempest-TestNetworkBasicOps-server-1530331023',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1530331023',id=19,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNREUxZEXDGlM1cS2M56hWAaHBDxXPzppWgxjbb+JJgdE8aiFxjvmXx2SUTfh+sxoQ78m1bgaZQKtd++zJCNAQJVePjcvSgQzSxjDBnGLKPV8IYUkASQ+pSRHD7qN2MCBA==',key_name='tempest-TestNetworkBasicOps-357394776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-gvmh0cdu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:30:25Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=d615ca65-b10a-4ff3-a274-7b300d2e1808,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.341 189070 DEBUG nova.network.os_vif_util [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.342 189070 DEBUG nova.network.os_vif_util [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:ae:d2,bridge_name='br-int',has_traffic_filtering=True,id=8ecf8178-afa3-4403-9157-3d31a281b7aa,network=Network(4b2a905b-11a3-4881-aea2-258a30f07ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ecf8178-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.343 189070 DEBUG os_vif [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:ae:d2,bridge_name='br-int',has_traffic_filtering=True,id=8ecf8178-afa3-4403-9157-3d31a281b7aa,network=Network(4b2a905b-11a3-4881-aea2-258a30f07ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ecf8178-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.343 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.344 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.344 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.349 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.349 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ecf8178-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.350 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8ecf8178-af, col_values=(('external_ids', {'iface-id': '8ecf8178-afa3-4403-9157-3d31a281b7aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:ae:d2', 'vm-uuid': 'd615ca65-b10a-4ff3-a274-7b300d2e1808'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.351 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:36 compute-1 NetworkManager[55704]: <info>  [1764927036.3523] manager: (tap8ecf8178-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.354 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.360 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.362 189070 INFO os_vif [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:ae:d2,bridge_name='br-int',has_traffic_filtering=True,id=8ecf8178-afa3-4403-9157-3d31a281b7aa,network=Network(4b2a905b-11a3-4881-aea2-258a30f07ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ecf8178-af')
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.418 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.418 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.418 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:d0:ae:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.419 189070 INFO nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Using config drive
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.984 189070 INFO nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Creating config drive at /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk.config
Dec 05 09:30:36 compute-1 nova_compute[189066]: 2025-12-05 09:30:36.989 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1ln3j9ke execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.121 189070 DEBUG oslo_concurrency.processutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1ln3j9ke" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:30:37 compute-1 kernel: tap8ecf8178-af: entered promiscuous mode
Dec 05 09:30:37 compute-1 NetworkManager[55704]: <info>  [1764927037.1958] manager: (tap8ecf8178-af): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Dec 05 09:30:37 compute-1 ovn_controller[95809]: 2025-12-05T09:30:37Z|00133|binding|INFO|Claiming lport 8ecf8178-afa3-4403-9157-3d31a281b7aa for this chassis.
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.198 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:37 compute-1 ovn_controller[95809]: 2025-12-05T09:30:37Z|00134|binding|INFO|8ecf8178-afa3-4403-9157-3d31a281b7aa: Claiming fa:16:3e:d0:ae:d2 10.100.0.24
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.216 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:ae:d2 10.100.0.24'], port_security=['fa:16:3e:d0:ae:d2 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'd615ca65-b10a-4ff3-a274-7b300d2e1808', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b2a905b-11a3-4881-aea2-258a30f07ac8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e81dcd5-6a80-4841-b630-75615078ae6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68649471-e728-44ca-85c4-bb6fd0fc7f6e, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=8ecf8178-afa3-4403-9157-3d31a281b7aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.218 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 8ecf8178-afa3-4403-9157-3d31a281b7aa in datapath 4b2a905b-11a3-4881-aea2-258a30f07ac8 bound to our chassis
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.220 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b2a905b-11a3-4881-aea2-258a30f07ac8
Dec 05 09:30:37 compute-1 systemd-udevd[225779]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.232 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c044b35e-fc71-4eff-9f8a-9e44efe9d83d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.233 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b2a905b-11 in ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.236 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b2a905b-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.236 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7800b9-dc7e-4654-a712-99babc4b7f28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.239 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[17214f9c-9bfa-4ea9-8685-b24a91b65fdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 NetworkManager[55704]: <info>  [1764927037.2438] device (tap8ecf8178-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:30:37 compute-1 NetworkManager[55704]: <info>  [1764927037.2457] device (tap8ecf8178-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.245 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:37 compute-1 ovn_controller[95809]: 2025-12-05T09:30:37Z|00135|binding|INFO|Setting lport 8ecf8178-afa3-4403-9157-3d31a281b7aa ovn-installed in OVS
Dec 05 09:30:37 compute-1 ovn_controller[95809]: 2025-12-05T09:30:37Z|00136|binding|INFO|Setting lport 8ecf8178-afa3-4403-9157-3d31a281b7aa up in Southbound
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.252 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.258 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d234cc-762b-4104-aee0-c5a460a742fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 systemd-machined[154815]: New machine qemu-10-instance-00000013.
Dec 05 09:30:37 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-00000013.
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.275 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a9240b6e-0542-4045-ba67-bc9a12e71370]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.315 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0cfac7-1ae8-43b1-aad1-a39696de4050]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 NetworkManager[55704]: <info>  [1764927037.3251] manager: (tap4b2a905b-10): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.323 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[42a797af-5254-4fd3-ae49-28e3a48dc132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.362 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdbc459-33eb-4ae8-83ca-a2f3863768cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.366 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d407cb-6e9b-4fc8-ad77-a4d160ded778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 NetworkManager[55704]: <info>  [1764927037.3962] device (tap4b2a905b-10): carrier: link connected
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.405 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[8572db2d-d66d-4222-8988-2c4fd8660b8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.423 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[bc2a32a4-11aa-4755-95eb-2af48863fa62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b2a905b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:16:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431040, 'reachable_time': 24618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225815, 'error': None, 'target': 'ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.443 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6a969fe4-32c1-4d24-9158-7d185053a942]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefa:1698'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431040, 'tstamp': 431040}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225816, 'error': None, 'target': 'ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.465 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[56e6b7ed-eb73-43d1-8586-bd6e7bfa50b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b2a905b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:16:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431040, 'reachable_time': 24618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225817, 'error': None, 'target': 'ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.504 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b4783805-7b37-41c0-8199-3c7c4a30d062]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.579 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927037.578442, d615ca65-b10a-4ff3-a274-7b300d2e1808 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.580 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] VM Started (Lifecycle Event)
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.579 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b8200562-0e55-4596-a543-3e54d41f84fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.582 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b2a905b-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.582 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.583 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b2a905b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:37 compute-1 kernel: tap4b2a905b-10: entered promiscuous mode
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.629 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:37 compute-1 NetworkManager[55704]: <info>  [1764927037.6317] manager: (tap4b2a905b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.633 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b2a905b-10, col_values=(('external_ids', {'iface-id': '2df964f1-ebe4-45a7-8282-c391a2faa310'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.633 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.635 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:37 compute-1 ovn_controller[95809]: 2025-12-05T09:30:37Z|00137|binding|INFO|Releasing lport 2df964f1-ebe4-45a7-8282-c391a2faa310 from this chassis (sb_readonly=0)
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.637 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b2a905b-11a3-4881-aea2-258a30f07ac8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b2a905b-11a3-4881-aea2-258a30f07ac8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.639 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4f8fc2d6-5b45-44ac-8a77-7612ea8c7863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.640 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-4b2a905b-11a3-4881-aea2-258a30f07ac8
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/4b2a905b-11a3-4881-aea2-258a30f07ac8.pid.haproxy
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 4b2a905b-11a3-4881-aea2-258a30f07ac8
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.640 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927037.578599, d615ca65-b10a-4ff3-a274-7b300d2e1808 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.640 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] VM Paused (Lifecycle Event)
Dec 05 09:30:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:37.641 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8', 'env', 'PROCESS_TAG=haproxy-4b2a905b-11a3-4881-aea2-258a30f07ac8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b2a905b-11a3-4881-aea2-258a30f07ac8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.646 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.683 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.689 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.749 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.929 189070 DEBUG nova.compute.manager [req-dd4c5257-2d72-4522-b439-7ed5339ce5a9 req-c5359578-7240-4313-a1ed-0f257c5af0c0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Received event network-vif-plugged-8ecf8178-afa3-4403-9157-3d31a281b7aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.929 189070 DEBUG oslo_concurrency.lockutils [req-dd4c5257-2d72-4522-b439-7ed5339ce5a9 req-c5359578-7240-4313-a1ed-0f257c5af0c0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.929 189070 DEBUG oslo_concurrency.lockutils [req-dd4c5257-2d72-4522-b439-7ed5339ce5a9 req-c5359578-7240-4313-a1ed-0f257c5af0c0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.930 189070 DEBUG oslo_concurrency.lockutils [req-dd4c5257-2d72-4522-b439-7ed5339ce5a9 req-c5359578-7240-4313-a1ed-0f257c5af0c0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.930 189070 DEBUG nova.compute.manager [req-dd4c5257-2d72-4522-b439-7ed5339ce5a9 req-c5359578-7240-4313-a1ed-0f257c5af0c0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Processing event network-vif-plugged-8ecf8178-afa3-4403-9157-3d31a281b7aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.931 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.934 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927037.9341476, d615ca65-b10a-4ff3-a274-7b300d2e1808 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.934 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] VM Resumed (Lifecycle Event)
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.936 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.939 189070 INFO nova.virt.libvirt.driver [-] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Instance spawned successfully.
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.940 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.984 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:37 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.995 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:37.999 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.000 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.000 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.000 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.001 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.001 189070 DEBUG nova.virt.libvirt.driver [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.049 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:30:38 compute-1 podman[225854]: 2025-12-05 09:30:38.106432573 +0000 UTC m=+0.052522904 container create 8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:30:38 compute-1 systemd[1]: Started libpod-conmon-8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132.scope.
Dec 05 09:30:38 compute-1 podman[225854]: 2025-12-05 09:30:38.075941532 +0000 UTC m=+0.022031883 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:30:38 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:30:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90878cbd4de19577d6b7d96a84221b51a26b5d43499b4e7f2d8c43d3443f1da7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:30:38 compute-1 podman[225854]: 2025-12-05 09:30:38.197233457 +0000 UTC m=+0.143323808 container init 8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 09:30:38 compute-1 podman[225854]: 2025-12-05 09:30:38.204910386 +0000 UTC m=+0.151000717 container start 8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:30:38 compute-1 podman[225865]: 2025-12-05 09:30:38.221612136 +0000 UTC m=+0.067700146 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Dec 05 09:30:38 compute-1 neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8[225868]: [NOTICE]   (225890) : New worker (225893) forked
Dec 05 09:30:38 compute-1 neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8[225868]: [NOTICE]   (225890) : Loading success.
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.323 189070 INFO nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Took 12.73 seconds to spawn the instance on the hypervisor.
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.323 189070 DEBUG nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.391 189070 INFO nova.compute.manager [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Took 13.30 seconds to build instance.
Dec 05 09:30:38 compute-1 nova_compute[189066]: 2025-12-05 09:30:38.414 189070 DEBUG oslo_concurrency.lockutils [None req-ab315688-b56d-460e-9e9d-6b5533215e96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:39 compute-1 nova_compute[189066]: 2025-12-05 09:30:39.507 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.071 189070 DEBUG nova.network.neutron [req-05f131e9-07c1-4945-8b55-1e5d989948b0 req-f2ea9a03-c76a-4d43-8819-a7683b2d5290 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Updated VIF entry in instance network info cache for port 8ecf8178-afa3-4403-9157-3d31a281b7aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.072 189070 DEBUG nova.network.neutron [req-05f131e9-07c1-4945-8b55-1e5d989948b0 req-f2ea9a03-c76a-4d43-8819-a7683b2d5290 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Updating instance_info_cache with network_info: [{"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.097 189070 DEBUG oslo_concurrency.lockutils [req-05f131e9-07c1-4945-8b55-1e5d989948b0 req-f2ea9a03-c76a-4d43-8819-a7683b2d5290 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-d615ca65-b10a-4ff3-a274-7b300d2e1808" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.133 189070 DEBUG nova.compute.manager [req-62be7825-2902-4a2a-9af6-2284e314df35 req-204faf8d-0855-48e5-bdc7-30a82a262875 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Received event network-vif-plugged-8ecf8178-afa3-4403-9157-3d31a281b7aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.133 189070 DEBUG oslo_concurrency.lockutils [req-62be7825-2902-4a2a-9af6-2284e314df35 req-204faf8d-0855-48e5-bdc7-30a82a262875 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.133 189070 DEBUG oslo_concurrency.lockutils [req-62be7825-2902-4a2a-9af6-2284e314df35 req-204faf8d-0855-48e5-bdc7-30a82a262875 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.134 189070 DEBUG oslo_concurrency.lockutils [req-62be7825-2902-4a2a-9af6-2284e314df35 req-204faf8d-0855-48e5-bdc7-30a82a262875 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.134 189070 DEBUG nova.compute.manager [req-62be7825-2902-4a2a-9af6-2284e314df35 req-204faf8d-0855-48e5-bdc7-30a82a262875 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] No waiting events found dispatching network-vif-plugged-8ecf8178-afa3-4403-9157-3d31a281b7aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:30:40 compute-1 nova_compute[189066]: 2025-12-05 09:30:40.134 189070 WARNING nova.compute.manager [req-62be7825-2902-4a2a-9af6-2284e314df35 req-204faf8d-0855-48e5-bdc7-30a82a262875 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Received unexpected event network-vif-plugged-8ecf8178-afa3-4403-9157-3d31a281b7aa for instance with vm_state active and task_state None.
Dec 05 09:30:41 compute-1 nova_compute[189066]: 2025-12-05 09:30:41.352 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:42 compute-1 podman[225903]: 2025-12-05 09:30:42.6587052 +0000 UTC m=+0.098875543 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 09:30:44 compute-1 nova_compute[189066]: 2025-12-05 09:30:44.514 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:44 compute-1 podman[225929]: 2025-12-05 09:30:44.610661609 +0000 UTC m=+0.049289534 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 09:30:46 compute-1 nova_compute[189066]: 2025-12-05 09:30:46.355 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:47 compute-1 podman[225949]: 2025-12-05 09:30:47.628552243 +0000 UTC m=+0.068279711 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:30:48 compute-1 NetworkManager[55704]: <info>  [1764927048.7000] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec 05 09:30:48 compute-1 NetworkManager[55704]: <info>  [1764927048.7009] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Dec 05 09:30:48 compute-1 nova_compute[189066]: 2025-12-05 09:30:48.698 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:48 compute-1 nova_compute[189066]: 2025-12-05 09:30:48.895 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:48 compute-1 ovn_controller[95809]: 2025-12-05T09:30:48Z|00138|binding|INFO|Releasing lport 2df964f1-ebe4-45a7-8282-c391a2faa310 from this chassis (sb_readonly=0)
Dec 05 09:30:48 compute-1 nova_compute[189066]: 2025-12-05 09:30:48.998 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:49 compute-1 nova_compute[189066]: 2025-12-05 09:30:49.573 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:51 compute-1 nova_compute[189066]: 2025-12-05 09:30:51.359 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:52 compute-1 ovn_controller[95809]: 2025-12-05T09:30:52Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:ae:d2 10.100.0.24
Dec 05 09:30:52 compute-1 ovn_controller[95809]: 2025-12-05T09:30:52Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:ae:d2 10.100.0.24
Dec 05 09:30:52 compute-1 podman[225983]: 2025-12-05 09:30:52.330662138 +0000 UTC m=+0.091627975 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Dec 05 09:30:53 compute-1 nova_compute[189066]: 2025-12-05 09:30:53.300 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:53 compute-1 nova_compute[189066]: 2025-12-05 09:30:53.526 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:54 compute-1 nova_compute[189066]: 2025-12-05 09:30:54.575 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:54 compute-1 podman[226005]: 2025-12-05 09:30:54.629451019 +0000 UTC m=+0.065127584 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:30:56 compute-1 nova_compute[189066]: 2025-12-05 09:30:56.361 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.549 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "d615ca65-b10a-4ff3-a274-7b300d2e1808" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.550 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.551 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.551 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.551 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.555 189070 INFO nova.compute.manager [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Terminating instance
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.557 189070 DEBUG nova.compute.manager [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.577 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 kernel: tap8ecf8178-af (unregistering): left promiscuous mode
Dec 05 09:30:59 compute-1 NetworkManager[55704]: <info>  [1764927059.5919] device (tap8ecf8178-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:30:59 compute-1 ovn_controller[95809]: 2025-12-05T09:30:59Z|00139|binding|INFO|Releasing lport 8ecf8178-afa3-4403-9157-3d31a281b7aa from this chassis (sb_readonly=0)
Dec 05 09:30:59 compute-1 ovn_controller[95809]: 2025-12-05T09:30:59Z|00140|binding|INFO|Setting lport 8ecf8178-afa3-4403-9157-3d31a281b7aa down in Southbound
Dec 05 09:30:59 compute-1 ovn_controller[95809]: 2025-12-05T09:30:59Z|00141|binding|INFO|Removing iface tap8ecf8178-af ovn-installed in OVS
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.600 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.603 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.609 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:ae:d2 10.100.0.24'], port_security=['fa:16:3e:d0:ae:d2 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'd615ca65-b10a-4ff3-a274-7b300d2e1808', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b2a905b-11a3-4881-aea2-258a30f07ac8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e81dcd5-6a80-4841-b630-75615078ae6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68649471-e728-44ca-85c4-bb6fd0fc7f6e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=8ecf8178-afa3-4403-9157-3d31a281b7aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.611 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 8ecf8178-afa3-4403-9157-3d31a281b7aa in datapath 4b2a905b-11a3-4881-aea2-258a30f07ac8 unbound from our chassis
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.614 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b2a905b-11a3-4881-aea2-258a30f07ac8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.618 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec1c254-cc17-478e-95c9-ca4f6b3ce3c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.619 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.619 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8 namespace which is not needed anymore
Dec 05 09:30:59 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000013.scope: Deactivated successfully.
Dec 05 09:30:59 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000013.scope: Consumed 14.499s CPU time.
Dec 05 09:30:59 compute-1 systemd-machined[154815]: Machine qemu-10-instance-00000013 terminated.
Dec 05 09:30:59 compute-1 neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8[225868]: [NOTICE]   (225890) : haproxy version is 2.8.14-c23fe91
Dec 05 09:30:59 compute-1 neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8[225868]: [NOTICE]   (225890) : path to executable is /usr/sbin/haproxy
Dec 05 09:30:59 compute-1 neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8[225868]: [WARNING]  (225890) : Exiting Master process...
Dec 05 09:30:59 compute-1 neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8[225868]: [ALERT]    (225890) : Current worker (225893) exited with code 143 (Terminated)
Dec 05 09:30:59 compute-1 neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8[225868]: [WARNING]  (225890) : All workers exited. Exiting... (0)
Dec 05 09:30:59 compute-1 systemd[1]: libpod-8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132.scope: Deactivated successfully.
Dec 05 09:30:59 compute-1 podman[226054]: 2025-12-05 09:30:59.777145758 +0000 UTC m=+0.052203085 container died 8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.777 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.781 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132-userdata-shm.mount: Deactivated successfully.
Dec 05 09:30:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-90878cbd4de19577d6b7d96a84221b51a26b5d43499b4e7f2d8c43d3443f1da7-merged.mount: Deactivated successfully.
Dec 05 09:30:59 compute-1 podman[226054]: 2025-12-05 09:30:59.814470577 +0000 UTC m=+0.089527894 container cleanup 8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.826 189070 INFO nova.virt.libvirt.driver [-] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Instance destroyed successfully.
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.827 189070 DEBUG nova.objects.instance [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'resources' on Instance uuid d615ca65-b10a-4ff3-a274-7b300d2e1808 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:30:59 compute-1 systemd[1]: libpod-conmon-8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132.scope: Deactivated successfully.
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.846 189070 DEBUG nova.virt.libvirt.vif [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:30:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1530331023',display_name='tempest-TestNetworkBasicOps-server-1530331023',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1530331023',id=19,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNREUxZEXDGlM1cS2M56hWAaHBDxXPzppWgxjbb+JJgdE8aiFxjvmXx2SUTfh+sxoQ78m1bgaZQKtd++zJCNAQJVePjcvSgQzSxjDBnGLKPV8IYUkASQ+pSRHD7qN2MCBA==',key_name='tempest-TestNetworkBasicOps-357394776',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:30:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-gvmh0cdu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:30:38Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=d615ca65-b10a-4ff3-a274-7b300d2e1808,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.846 189070 DEBUG nova.network.os_vif_util [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "address": "fa:16:3e:d0:ae:d2", "network": {"id": "4b2a905b-11a3-4881-aea2-258a30f07ac8", "bridge": "br-int", "label": "tempest-network-smoke--2084868532", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ecf8178-af", "ovs_interfaceid": "8ecf8178-afa3-4403-9157-3d31a281b7aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.847 189070 DEBUG nova.network.os_vif_util [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:ae:d2,bridge_name='br-int',has_traffic_filtering=True,id=8ecf8178-afa3-4403-9157-3d31a281b7aa,network=Network(4b2a905b-11a3-4881-aea2-258a30f07ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ecf8178-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.848 189070 DEBUG os_vif [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:ae:d2,bridge_name='br-int',has_traffic_filtering=True,id=8ecf8178-afa3-4403-9157-3d31a281b7aa,network=Network(4b2a905b-11a3-4881-aea2-258a30f07ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ecf8178-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.851 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.851 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ecf8178-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.855 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.861 189070 INFO os_vif [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:ae:d2,bridge_name='br-int',has_traffic_filtering=True,id=8ecf8178-afa3-4403-9157-3d31a281b7aa,network=Network(4b2a905b-11a3-4881-aea2-258a30f07ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ecf8178-af')
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.861 189070 INFO nova.virt.libvirt.driver [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Deleting instance files /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808_del
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.862 189070 INFO nova.virt.libvirt.driver [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Deletion of /var/lib/nova/instances/d615ca65-b10a-4ff3-a274-7b300d2e1808_del complete
Dec 05 09:30:59 compute-1 podman[226097]: 2025-12-05 09:30:59.885115116 +0000 UTC m=+0.040685482 container remove 8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.891 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[67dc80d6-0076-4ea3-ba37-f26f01776b45]: (4, ('Fri Dec  5 09:30:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8 (8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132)\n8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132\nFri Dec  5 09:30:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8 (8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132)\n8fe18b98975dfc7fe1b635c6aeca8f02c0633546ba08e2d4e1ba60f288ac4132\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.893 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[40c85f02-2a77-4924-a9cc-289711c1feb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.894 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b2a905b-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.896 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 kernel: tap4b2a905b-10: left promiscuous mode
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.907 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.913 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[38109182-7efc-4bb8-ace6-017a4dd33c31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.927 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[de42e6ef-1610-4cfa-89a5-ab272e6e3b5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.928 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ceecd1e9-1687-42c2-9a22-f5a75d820d4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.937 189070 INFO nova.compute.manager [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.937 189070 DEBUG oslo.service.loopingcall [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.938 189070 DEBUG nova.compute.manager [-] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:30:59 compute-1 nova_compute[189066]: 2025-12-05 09:30:59.938 189070 DEBUG nova.network.neutron [-] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.948 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0474ad91-c4a0-4951-b7a7-3cdf4f7eb276]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431031, 'reachable_time': 42088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226112, 'error': None, 'target': 'ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.951 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b2a905b-11a3-4881-aea2-258a30f07ac8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:30:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:30:59.951 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[475f3468-c018-4743-a43e-8741a9910a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:30:59 compute-1 systemd[1]: run-netns-ovnmeta\x2d4b2a905b\x2d11a3\x2d4881\x2daea2\x2d258a30f07ac8.mount: Deactivated successfully.
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.385 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "a0adc838-396e-45a2-8503-a2ab451cd778" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.385 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.411 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.550 189070 DEBUG nova.network.neutron [-] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.571 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.572 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.582 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.583 189070 INFO nova.compute.claims [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.588 189070 INFO nova.compute.manager [-] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Took 1.65 seconds to deallocate network for instance.
Dec 05 09:31:01 compute-1 podman[226113]: 2025-12-05 09:31:01.641206824 +0000 UTC m=+0.075291124 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.668 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.766 189070 DEBUG nova.compute.provider_tree [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.787 189070 DEBUG nova.scheduler.client.report [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.816 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.817 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.819 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.880 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.881 189070 DEBUG nova.network.neutron [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.916 189070 DEBUG nova.compute.provider_tree [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.919 189070 INFO nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.941 189070 DEBUG nova.scheduler.client.report [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.946 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:31:01 compute-1 nova_compute[189066]: 2025-12-05 09:31:01.980 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.042 189070 INFO nova.scheduler.client.report [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Deleted allocations for instance d615ca65-b10a-4ff3-a274-7b300d2e1808
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.107 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.109 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.110 189070 INFO nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Creating image(s)
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.110 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "/var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.111 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "/var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.111 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "/var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.129 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.167 189070 DEBUG oslo_concurrency.lockutils [None req-9369d8b4-f1ed-4091-8726-2a59baae17ba 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "d615ca65-b10a-4ff3-a274-7b300d2e1808" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.194 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.194 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.195 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.205 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.234 189070 DEBUG nova.compute.manager [req-3ef8af1b-21b9-4658-9a01-66f4c985290b req-7d6d8772-d09d-4039-bb25-deceb4b09428 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Received event network-vif-deleted-8ecf8178-afa3-4403-9157-3d31a281b7aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.267 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.268 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.305 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.306 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.307 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.339 189070 DEBUG nova.policy [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.365 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.367 189070 DEBUG nova.virt.disk.api [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Checking if we can resize image /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.367 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.427 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.428 189070 DEBUG nova.virt.disk.api [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Cannot resize image /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.429 189070 DEBUG nova.objects.instance [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lazy-loading 'migration_context' on Instance uuid a0adc838-396e-45a2-8503-a2ab451cd778 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.448 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.448 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Ensure instance console log exists: /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.449 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.449 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:02 compute-1 nova_compute[189066]: 2025-12-05 09:31:02.449 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:04 compute-1 nova_compute[189066]: 2025-12-05 09:31:04.581 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:04 compute-1 nova_compute[189066]: 2025-12-05 09:31:04.854 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:04 compute-1 nova_compute[189066]: 2025-12-05 09:31:04.909 189070 DEBUG nova.network.neutron [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Successfully created port: c8f841b5-bac4-4aed-83b6-65ebccb9c49e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:31:05 compute-1 nova_compute[189066]: 2025-12-05 09:31:05.705 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:05 compute-1 nova_compute[189066]: 2025-12-05 09:31:05.926 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:06 compute-1 nova_compute[189066]: 2025-12-05 09:31:06.454 189070 DEBUG nova.network.neutron [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Successfully updated port: c8f841b5-bac4-4aed-83b6-65ebccb9c49e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:31:06 compute-1 nova_compute[189066]: 2025-12-05 09:31:06.495 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:31:06 compute-1 nova_compute[189066]: 2025-12-05 09:31:06.496 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquired lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:31:06 compute-1 nova_compute[189066]: 2025-12-05 09:31:06.496 189070 DEBUG nova.network.neutron [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:31:06 compute-1 nova_compute[189066]: 2025-12-05 09:31:06.848 189070 DEBUG nova.network.neutron [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:31:06 compute-1 nova_compute[189066]: 2025-12-05 09:31:06.869 189070 DEBUG nova.compute.manager [req-3cf9f110-6013-46b7-9b5e-6cb8000a563a req-384e4f01-df7f-41a1-a74f-219f6ea36784 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-changed-c8f841b5-bac4-4aed-83b6-65ebccb9c49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:31:06 compute-1 nova_compute[189066]: 2025-12-05 09:31:06.869 189070 DEBUG nova.compute.manager [req-3cf9f110-6013-46b7-9b5e-6cb8000a563a req-384e4f01-df7f-41a1-a74f-219f6ea36784 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Refreshing instance network info cache due to event network-changed-c8f841b5-bac4-4aed-83b6-65ebccb9c49e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:31:06 compute-1 nova_compute[189066]: 2025-12-05 09:31:06.870 189070 DEBUG oslo_concurrency.lockutils [req-3cf9f110-6013-46b7-9b5e-6cb8000a563a req-384e4f01-df7f-41a1-a74f-219f6ea36784 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:31:08 compute-1 podman[226153]: 2025-12-05 09:31:08.648664051 +0000 UTC m=+0.088021196 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Dec 05 09:31:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:08.875 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:08.875 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:08.876 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.107 189070 DEBUG nova.network.neutron [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updating instance_info_cache with network_info: [{"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.148 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Releasing lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.149 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Instance network_info: |[{"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.149 189070 DEBUG oslo_concurrency.lockutils [req-3cf9f110-6013-46b7-9b5e-6cb8000a563a req-384e4f01-df7f-41a1-a74f-219f6ea36784 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.149 189070 DEBUG nova.network.neutron [req-3cf9f110-6013-46b7-9b5e-6cb8000a563a req-384e4f01-df7f-41a1-a74f-219f6ea36784 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Refreshing network info cache for port c8f841b5-bac4-4aed-83b6-65ebccb9c49e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.152 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Start _get_guest_xml network_info=[{"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.158 189070 WARNING nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.164 189070 DEBUG nova.virt.libvirt.host [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.166 189070 DEBUG nova.virt.libvirt.host [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.170 189070 DEBUG nova.virt.libvirt.host [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.171 189070 DEBUG nova.virt.libvirt.host [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.173 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.173 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.174 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.174 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.174 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.174 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.174 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.174 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.175 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.175 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.175 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.175 189070 DEBUG nova.virt.hardware [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.179 189070 DEBUG nova.virt.libvirt.vif [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:31:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1223075532-ac',id=22,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD+5Pg7shJpMimDThmBqstpAx0KkRt2i5yGiHWlpBuzwDEqniUE3UrKuB7/FlvnvA8AQmg1XL8HIIV8GqoiYNn1cRv3JE7sesclhpKZ5mzYbeAsim3ia0Zws6nHf4Otwlg==',key_name='tempest-TestSecurityGroupsBasicOps-2125780827',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6c5bb818cba543bbb1bcff8df31dd9cd',ramdisk_id='',reservation_id='r-1l334wzq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1223075532',owner_user_name='tempest-TestSecurityGroupsBasicOps-1223075532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:31:01Z,user_data=None,user_id='bcc37d16c39547bba794fb1f43e889c1',uuid=a0adc838-396e-45a2-8503-a2ab451cd778,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.180 189070 DEBUG nova.network.os_vif_util [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converting VIF {"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.180 189070 DEBUG nova.network.os_vif_util [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:2b:28,bridge_name='br-int',has_traffic_filtering=True,id=c8f841b5-bac4-4aed-83b6-65ebccb9c49e,network=Network(dbc58ba4-8158-4179-bf5b-c94cb9a82196),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f841b5-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.181 189070 DEBUG nova.objects.instance [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lazy-loading 'pci_devices' on Instance uuid a0adc838-396e-45a2-8503-a2ab451cd778 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.205 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <uuid>a0adc838-396e-45a2-8503-a2ab451cd778</uuid>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <name>instance-00000016</name>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169</nova:name>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:31:09</nova:creationTime>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:31:09 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:31:09 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:31:09 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:31:09 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:31:09 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:31:09 compute-1 nova_compute[189066]:         <nova:user uuid="bcc37d16c39547bba794fb1f43e889c1">tempest-TestSecurityGroupsBasicOps-1223075532-project-member</nova:user>
Dec 05 09:31:09 compute-1 nova_compute[189066]:         <nova:project uuid="6c5bb818cba543bbb1bcff8df31dd9cd">tempest-TestSecurityGroupsBasicOps-1223075532</nova:project>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:31:09 compute-1 nova_compute[189066]:         <nova:port uuid="c8f841b5-bac4-4aed-83b6-65ebccb9c49e">
Dec 05 09:31:09 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <system>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <entry name="serial">a0adc838-396e-45a2-8503-a2ab451cd778</entry>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <entry name="uuid">a0adc838-396e-45a2-8503-a2ab451cd778</entry>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </system>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <os>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   </os>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <features>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   </features>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk.config"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:a7:2b:28"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <target dev="tapc8f841b5-ba"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/console.log" append="off"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <video>
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </video>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:31:09 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:31:09 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:31:09 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:31:09 compute-1 nova_compute[189066]: </domain>
Dec 05 09:31:09 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.205 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Preparing to wait for external event network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.205 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.206 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.206 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.207 189070 DEBUG nova.virt.libvirt.vif [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:31:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1223075532-ac',id=22,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD+5Pg7shJpMimDThmBqstpAx0KkRt2i5yGiHWlpBuzwDEqniUE3UrKuB7/FlvnvA8AQmg1XL8HIIV8GqoiYNn1cRv3JE7sesclhpKZ5mzYbeAsim3ia0Zws6nHf4Otwlg==',key_name='tempest-TestSecurityGroupsBasicOps-2125780827',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6c5bb818cba543bbb1bcff8df31dd9cd',ramdisk_id='',reservation_id='r-1l334wzq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1223075532',owner_user_name='tempest-TestSecurityGroupsBasicOps-1223075532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:31:01Z,user_data=None,user_id='bcc37d16c39547bba794fb1f43e889c1',uuid=a0adc838-396e-45a2-8503-a2ab451cd778,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.207 189070 DEBUG nova.network.os_vif_util [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converting VIF {"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.207 189070 DEBUG nova.network.os_vif_util [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:2b:28,bridge_name='br-int',has_traffic_filtering=True,id=c8f841b5-bac4-4aed-83b6-65ebccb9c49e,network=Network(dbc58ba4-8158-4179-bf5b-c94cb9a82196),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f841b5-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.208 189070 DEBUG os_vif [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:2b:28,bridge_name='br-int',has_traffic_filtering=True,id=c8f841b5-bac4-4aed-83b6-65ebccb9c49e,network=Network(dbc58ba4-8158-4179-bf5b-c94cb9a82196),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f841b5-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.208 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.208 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.209 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.212 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.213 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8f841b5-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.213 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc8f841b5-ba, col_values=(('external_ids', {'iface-id': 'c8f841b5-bac4-4aed-83b6-65ebccb9c49e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:2b:28', 'vm-uuid': 'a0adc838-396e-45a2-8503-a2ab451cd778'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.215 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:09 compute-1 NetworkManager[55704]: <info>  [1764927069.2170] manager: (tapc8f841b5-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.218 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.224 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.225 189070 INFO os_vif [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:2b:28,bridge_name='br-int',has_traffic_filtering=True,id=c8f841b5-bac4-4aed-83b6-65ebccb9c49e,network=Network(dbc58ba4-8158-4179-bf5b-c94cb9a82196),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f841b5-ba')
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.320 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.321 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.321 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] No VIF found with MAC fa:16:3e:a7:2b:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.321 189070 INFO nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Using config drive
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.634 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.972 189070 INFO nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Creating config drive at /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk.config
Dec 05 09:31:09 compute-1 nova_compute[189066]: 2025-12-05 09:31:09.980 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjidtkn3l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.112 189070 DEBUG oslo_concurrency.processutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjidtkn3l" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:10 compute-1 kernel: tapc8f841b5-ba: entered promiscuous mode
Dec 05 09:31:10 compute-1 NetworkManager[55704]: <info>  [1764927070.1827] manager: (tapc8f841b5-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Dec 05 09:31:10 compute-1 ovn_controller[95809]: 2025-12-05T09:31:10Z|00142|binding|INFO|Claiming lport c8f841b5-bac4-4aed-83b6-65ebccb9c49e for this chassis.
Dec 05 09:31:10 compute-1 ovn_controller[95809]: 2025-12-05T09:31:10Z|00143|binding|INFO|c8f841b5-bac4-4aed-83b6-65ebccb9c49e: Claiming fa:16:3e:a7:2b:28 10.100.0.5
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.188 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.191 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.200 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:2b:28 10.100.0.5'], port_security=['fa:16:3e:a7:2b:28 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbc58ba4-8158-4179-bf5b-c94cb9a82196', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '923cdeef-e7c2-4413-bfb2-fbe572ab445e b6dc48f6-1863-466c-8b58-dcc336cd88a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eec9cf42-f628-43fa-9672-aa1cd3542052, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=c8f841b5-bac4-4aed-83b6-65ebccb9c49e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.202 105272 INFO neutron.agent.ovn.metadata.agent [-] Port c8f841b5-bac4-4aed-83b6-65ebccb9c49e in datapath dbc58ba4-8158-4179-bf5b-c94cb9a82196 bound to our chassis
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.203 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dbc58ba4-8158-4179-bf5b-c94cb9a82196
Dec 05 09:31:10 compute-1 systemd-udevd[226193]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.217 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e23ac7af-8bad-430f-964b-b5c13ea47f07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.218 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdbc58ba4-81 in ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.220 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdbc58ba4-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.221 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3fbcc89a-f8cf-45e5-9f81-afdb0449b7bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.222 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b6203ae4-6a9b-4b1c-9498-2a1e152cbfe5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 systemd-machined[154815]: New machine qemu-11-instance-00000016.
Dec 05 09:31:10 compute-1 NetworkManager[55704]: <info>  [1764927070.2269] device (tapc8f841b5-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:31:10 compute-1 NetworkManager[55704]: <info>  [1764927070.2280] device (tapc8f841b5-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.235 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[0435f224-7e33-4070-8d9d-24b2db74426b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-00000016.
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.243 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:10 compute-1 ovn_controller[95809]: 2025-12-05T09:31:10Z|00144|binding|INFO|Setting lport c8f841b5-bac4-4aed-83b6-65ebccb9c49e ovn-installed in OVS
Dec 05 09:31:10 compute-1 ovn_controller[95809]: 2025-12-05T09:31:10Z|00145|binding|INFO|Setting lport c8f841b5-bac4-4aed-83b6-65ebccb9c49e up in Southbound
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.245 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.261 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[382ecb9d-1854-41ad-8b0f-0ec6dddc3da0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.298 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[8a49023e-52bb-4284-b069-f8bf03be8c50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 systemd-udevd[226199]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.305 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[82897120-24bb-430b-9894-f9eac5c746c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 NetworkManager[55704]: <info>  [1764927070.3065] manager: (tapdbc58ba4-80): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.340 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[c03387ab-cf30-44a3-9c72-6cf82cd41a0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.344 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cf4f10-72b5-4130-b3c4-8e3a847b42e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 NetworkManager[55704]: <info>  [1764927070.3662] device (tapdbc58ba4-80): carrier: link connected
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.373 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[79254e88-9361-4132-8fa6-1cbeee0138b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.393 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c6afa10f-2e32-4928-aa6b-b32ae31c8c12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdbc58ba4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:54:ba:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434337, 'reachable_time': 44663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226233, 'error': None, 'target': 'ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.410 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fed33915-b5d9-4fea-a05d-5d22152ae6c4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe54:ba52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434337, 'tstamp': 434337}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226235, 'error': None, 'target': 'ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.432 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[07b08be8-4ecf-41e8-b066-78ed4b4f7b18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdbc58ba4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:54:ba:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434337, 'reachable_time': 44663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226236, 'error': None, 'target': 'ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.472 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f05a5da3-0d63-4326-bad4-9f8ae0831e5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.492 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927070.4914384, a0adc838-396e-45a2-8503-a2ab451cd778 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.492 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] VM Started (Lifecycle Event)
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.517 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.521 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927070.494723, a0adc838-396e-45a2-8503-a2ab451cd778 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.522 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] VM Paused (Lifecycle Event)
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.532 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7662c702-31e6-4c13-9314-ec983c5dd21b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.534 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbc58ba4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.534 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.534 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdbc58ba4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.536 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:10 compute-1 NetworkManager[55704]: <info>  [1764927070.5377] manager: (tapdbc58ba4-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Dec 05 09:31:10 compute-1 kernel: tapdbc58ba4-80: entered promiscuous mode
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.540 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdbc58ba4-80, col_values=(('external_ids', {'iface-id': 'dfe2c5ef-972a-4b59-b38d-e7bdc4437f37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.540 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:10 compute-1 ovn_controller[95809]: 2025-12-05T09:31:10Z|00146|binding|INFO|Releasing lport dfe2c5ef-972a-4b59-b38d-e7bdc4437f37 from this chassis (sb_readonly=0)
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.541 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.542 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dbc58ba4-8158-4179-bf5b-c94cb9a82196.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dbc58ba4-8158-4179-bf5b-c94cb9a82196.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.543 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4451f3-bcbd-4293-a376-721cc7b35b63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.543 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.543 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-dbc58ba4-8158-4179-bf5b-c94cb9a82196
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/dbc58ba4-8158-4179-bf5b-c94cb9a82196.pid.haproxy
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID dbc58ba4-8158-4179-bf5b-c94cb9a82196
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:31:10 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:10.544 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196', 'env', 'PROCESS_TAG=haproxy-dbc58ba4-8158-4179-bf5b-c94cb9a82196', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dbc58ba4-8158-4179-bf5b-c94cb9a82196.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.548 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.554 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.574 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.600 189070 DEBUG nova.network.neutron [req-3cf9f110-6013-46b7-9b5e-6cb8000a563a req-384e4f01-df7f-41a1-a74f-219f6ea36784 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updated VIF entry in instance network info cache for port c8f841b5-bac4-4aed-83b6-65ebccb9c49e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.600 189070 DEBUG nova.network.neutron [req-3cf9f110-6013-46b7-9b5e-6cb8000a563a req-384e4f01-df7f-41a1-a74f-219f6ea36784 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updating instance_info_cache with network_info: [{"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:31:10 compute-1 nova_compute[189066]: 2025-12-05 09:31:10.635 189070 DEBUG oslo_concurrency.lockutils [req-3cf9f110-6013-46b7-9b5e-6cb8000a563a req-384e4f01-df7f-41a1-a74f-219f6ea36784 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.750 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000016', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'hostId': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.754 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.784 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.786 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86a0146b-cfe8-45f6-8078-5d88acfe3f11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.754749', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22976b72-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': '1667c68b8f2d412d2064d51ad3ad5c57942525355ddd0f10424c14ff36360bde'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.754749', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22978814-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': 'caceeaa0466106f46c448e37cdf400d6098dec74ab3d2b96baa234a36570145e'}]}, 'timestamp': '2025-12-05 09:31:10.786720', '_unique_id': 'cdac4871aaac43b2a7390cc8a83b8042'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.790 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.795 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a0adc838-396e-45a2-8503-a2ab451cd778 / tapc8f841b5-ba inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.795 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64a834bf-e048-4f01-b665-12666b560471', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.791824', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '2298f712-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '937586c9eee5d02acdd70a28f8200e0a85f7ebaa07ebdc8ee1416502c9ddd273'}]}, 'timestamp': '2025-12-05 09:31:10.796221', '_unique_id': 'c6e7e6415313434c8f0ba3809c9f640a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.797 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.799 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.800 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ad21045-bdbd-4eb2-8711-90857e3f0c72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.799884', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22999ee2-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': 'e43d9a359e3b5d6d7af1aa10b0d2e5b1be9d0f0eee8397c94acfa6641889b5e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.799884', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2299aedc-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': 'c974dd00f99816fd7ee639a91ab2c91070bd2b7f845af6ca1ea954af6909315b'}]}, 'timestamp': '2025-12-05 09:31:10.800792', '_unique_id': 'ea92ed3cf29c430282256c0ddf7d65a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.802 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.804 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.804 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1892bc35-9e91-4336-bdb9-3dc96f6314ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.804096', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '229a434c-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': 'da2f998d90919f456e6ecf6f92edcb0d77f3949a710aedc0731bc67f39b11549'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.804096', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '229a526a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': 'c82996482752a263fe4d7b254e71d89d35c005c22ca0c692231df598fec723e5'}]}, 'timestamp': '2025-12-05 09:31:10.804955', '_unique_id': '7c244537aa2447efb76180a763ea16fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.805 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.806 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.823 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.823 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance a0adc838-396e-45a2-8503-a2ab451cd778: ceilometer.compute.pollsters.NoVolumeException
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.824 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.824 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c854995e-ac8f-4d71-a060-030fe901bcb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.824338', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '229d5a46-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '7845fa4968bd5c576d9a19f99b4e2bed53d00631b99231d2c146aa93eb054c5b'}]}, 'timestamp': '2025-12-05 09:31:10.824980', '_unique_id': 'bda9bc01e19e457e8d2812670b3440b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.826 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.828 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d0f240f-c814-4ad2-b73d-778eeafda4fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.828411', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '229df578-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '9eb0e753893ecf356bb6d826d1ace3e2185caeae151deeaafe5879440251d0e9'}]}, 'timestamp': '2025-12-05 09:31:10.828914', '_unique_id': 'a4b47612a6ab43129ba558b22ca63f1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.829 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.831 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.831 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '790c34d5-374b-4553-97be-04298e08eaf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.831315', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '229e6846-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': '37da2f3446740a57e25516bf3288dc94b39274aecfa1ecf6ce065a9247913bf4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.831315', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '229e74b2-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': '2f3bf62f5c86b6792cc0374f65dd9325fe0b3ad13a5066409480a9fb42acd645'}]}, 'timestamp': '2025-12-05 09:31:10.832066', '_unique_id': '64949d4f532849278870de5d62f7bc46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.832 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.834 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.834 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9ab2b7a-752f-4f90-8c61-21a23e021872', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.834251', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '229ed74a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '737feca3d123c2974256d145fed800d4675a0f6dbe79d45b73cd77713f111781'}]}, 'timestamp': '2025-12-05 09:31:10.834658', '_unique_id': '19aeb7541c0e4f24b51093ee957c00ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.835 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.836 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.836 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79806d9f-5193-452b-be1e-3001d3f3effb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'timestamp': '2025-12-05T09:31:10.836942', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '229f42e8-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.880583873, 'message_signature': '63f0a5c296f776b8db2331798b6f53855c1292622761113d178a672341565f10'}]}, 'timestamp': '2025-12-05 09:31:10.837364', '_unique_id': 'd6fe4dc9c7b54e438836d91cae4747ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.838 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.839 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.839 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13decc66-0419-4b30-a987-b43a72c656cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.839401', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '229fa0d0-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '8eb0e30dd0a33ce842ac0f6e49b06acc99c5b353efd4ee745453e2373f58dda1'}]}, 'timestamp': '2025-12-05 09:31:10.839764', '_unique_id': '4c72d012a8f24cd3b94805b9b4c64b7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.840 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.841 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.842 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.842 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169>]
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.842 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.887 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.888 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3cba7a6-31ee-4abf-b07e-4be107cb4aa2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.842752', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22a70cda-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.900319689, 'message_signature': '4eb708cb8582dcc1a814827fc69a1c43ad2a81b9d6754d21d7f2668c9e93f505'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 
'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.842752', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22a71d06-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.900319689, 'message_signature': '9cf81153e94c1658a82eccdede1bbc8bb5cadd3e2f467612439044e01946b2c0'}]}, 'timestamp': '2025-12-05 09:31:10.888809', '_unique_id': '28104ca27ffb4f70b2f8dc17627bdf1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.890 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.891 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.891 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fe66dc2-0fd5-4924-95d4-24d1a95ff827', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.891439', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22a7934e-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.900319689, 'message_signature': '2804af3207d3c40dc7621ef03c7b00c0b74dcded7d0ff1017ad7b33bb2c252e3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 
'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.891439', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22a79eb6-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.900319689, 'message_signature': '6f0aa092358d739255d14f244781cd6890c5a186803953a96fbb3b25732f1ebd'}]}, 'timestamp': '2025-12-05 09:31:10.892106', '_unique_id': 'bf5cae57f8514275a86c8244c4479539'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.892 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.893 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51a3ebb9-c122-48bf-becc-903df18f6522', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.893786', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '22a7ef2e-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '9d562ba7e1206def8d0715976aa3b0dc38005cfb91973501b25caf49cc0af60b'}]}, 'timestamp': '2025-12-05 09:31:10.894214', '_unique_id': 'c471662f33d14710a074ab1c1c841f76'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.894 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.895 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3be8a3c0-0ab2-4543-8b9a-d2132b9174cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.895938', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '22a83f56-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '30c1d3aaa1dc3f1046f69bd264174ec192696b8268f8754d90e7c0d20e417be7'}]}, 'timestamp': '2025-12-05 09:31:10.896267', '_unique_id': '1d25b560abb8458d9c81453b54578ebf'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.896 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.897 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c63757a-ff0f-4d6a-bb93-0564783feb36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.897865', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '22a88aa6-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '4c6be61ae75f6d23b2a2048a13d4aaed63ffb5394a166e8534b22a3863af984e'}]}, 'timestamp': '2025-12-05 09:31:10.898155', '_unique_id': '5cc9a47062d941ccb1f8032712fff621'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.898 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.899 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee43c977-a26d-4614-801d-9957809aa7a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.899788', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '22a8d600-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': 'f5e66cef967dca82b1c083605976a93d10d6b86c93f3789fcc7a412629e9e7cc'}]}, 'timestamp': '2025-12-05 09:31:10.900114', '_unique_id': '2c3d271f614d4dcfa2a11f1592c249be'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.900 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.901 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.902 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.902 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169>]
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.902 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.902 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169>]
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.902 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.903 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c05d1031-3874-4403-8c99-a6adb5a950fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.902934', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22a950c6-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': 'e0aed7219cfee1f52ee0d51cf327684152e442995a9fcf9cd2d37481dd02ac61'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': 
'6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.902934', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22a95b20-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': 'a6d4d303c772d00fa2834dd3b8ba6af2282b0e192c492fd29a263d88e86fce2a'}]}, 'timestamp': '2025-12-05 09:31:10.903480', '_unique_id': 'ea700894d4a54036b43df2c74e070a32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.904 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.905 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.905 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13b1bbd9-56a7-4e9c-8afc-1d827986f8e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.905188', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22a9a88c-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.900319689, 'message_signature': '39fc66ecf0505c7b75370896c05b0c6b7c2613e56da78bd26b4165510c5bf745'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.905188', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22a9b5c0-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.900319689, 'message_signature': 'eda0fbbca261df2f51afbcc1120c47e27d8c5d83ed0f827b9df44482e4153dc4'}]}, 'timestamp': '2025-12-05 09:31:10.905799', '_unique_id': '5a4bcbed86f8453b9e426c2f5cf15bae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.906 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.907 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.907 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac28a3df-6ab3-456e-9010-08b70de38441', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'instance-00000016-a0adc838-396e-45a2-8503-a2ab451cd778-tapc8f841b5-ba', 'timestamp': '2025-12-05T09:31:10.907448', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'tapc8f841b5-ba', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a7:2b:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc8f841b5-ba'}, 'message_id': '22aa020a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.849441956, 'message_signature': '2c0acc0be44604d8f80efb242d4b550f31e1c5c46ee57e2f66110cc6f3a6611a'}]}, 'timestamp': '2025-12-05 09:31:10.907765', '_unique_id': '794a89f1042447c2a963054f4ccac400'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.908 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.909 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.909 12 DEBUG ceilometer.compute.pollsters [-] a0adc838-396e-45a2-8503-a2ab451cd778/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '977e0384-5ae4-48b3-9efa-110c90a41787', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-vda', 'timestamp': '2025-12-05T09:31:10.909337', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '22aa4a76-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': '6a76ab35c6b83bab01786d6c64b8a0b51259a57ea3c64e0a27b16ccde28f74ec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bcc37d16c39547bba794fb1f43e889c1', 'user_name': None, 'project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'project_name': None, 'resource_id': 'a0adc838-396e-45a2-8503-a2ab451cd778-sda', 'timestamp': '2025-12-05T09:31:10.909337', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169', 'name': 'instance-00000016', 'instance_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'instance_type': 'm1.nano', 'host': 'f2577ca8636883f79a0aa37c042a4881acabab47495e37fd012025fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22aa56b0-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4343.812393455, 'message_signature': '876ee16b22865ac042ddf27a9ad8d859bb332e178591487b5b7c23d6b4b12bcb'}]}, 'timestamp': '2025-12-05 09:31:10.909940', '_unique_id': 'c7ff98a4b03a4b0f9bffe6e9848a369f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.910 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.911 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:31:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:31:10.912 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169>]
Dec 05 09:31:10 compute-1 podman[226268]: 2025-12-05 09:31:10.961856247 +0000 UTC m=+0.053160009 container create 2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 09:31:10 compute-1 systemd[1]: Started libpod-conmon-2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db.scope.
Dec 05 09:31:11 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:31:11 compute-1 nova_compute[189066]: 2025-12-05 09:31:11.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:11 compute-1 nova_compute[189066]: 2025-12-05 09:31:11.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:11 compute-1 nova_compute[189066]: 2025-12-05 09:31:11.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8a08c4d73e653d495ce7a5964b3e3119615b7065cd702612681aed36b236bf6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:31:11 compute-1 podman[226268]: 2025-12-05 09:31:10.936111693 +0000 UTC m=+0.027415485 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:31:11 compute-1 podman[226268]: 2025-12-05 09:31:11.039776014 +0000 UTC m=+0.131079796 container init 2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:31:11 compute-1 podman[226268]: 2025-12-05 09:31:11.045607048 +0000 UTC m=+0.136910810 container start 2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:31:11 compute-1 neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196[226284]: [NOTICE]   (226288) : New worker (226290) forked
Dec 05 09:31:11 compute-1 neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196[226284]: [NOTICE]   (226288) : Loading success.
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.302 189070 DEBUG nova.compute.manager [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.302 189070 DEBUG oslo_concurrency.lockutils [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.303 189070 DEBUG oslo_concurrency.lockutils [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.303 189070 DEBUG oslo_concurrency.lockutils [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.303 189070 DEBUG nova.compute.manager [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Processing event network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.304 189070 DEBUG nova.compute.manager [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.304 189070 DEBUG oslo_concurrency.lockutils [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.304 189070 DEBUG oslo_concurrency.lockutils [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.304 189070 DEBUG oslo_concurrency.lockutils [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.305 189070 DEBUG nova.compute.manager [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] No waiting events found dispatching network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.305 189070 WARNING nova.compute.manager [req-b916caa2-5e0f-4ad7-81d8-58d0f973a954 req-4e682371-d1d9-4a8e-a14d-a5ec5f26e848 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received unexpected event network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e for instance with vm_state building and task_state spawning.
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.306 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.311 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927072.3109522, a0adc838-396e-45a2-8503-a2ab451cd778 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.312 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] VM Resumed (Lifecycle Event)
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.315 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.318 189070 INFO nova.virt.libvirt.driver [-] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Instance spawned successfully.
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.319 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.340 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.345 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.348 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.349 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.349 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.350 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.350 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.350 189070 DEBUG nova.virt.libvirt.driver [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.380 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.739 189070 INFO nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Took 10.63 seconds to spawn the instance on the hypervisor.
Dec 05 09:31:12 compute-1 nova_compute[189066]: 2025-12-05 09:31:12.739 189070 DEBUG nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:13 compute-1 nova_compute[189066]: 2025-12-05 09:31:13.161 189070 INFO nova.compute.manager [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Took 11.64 seconds to build instance.
Dec 05 09:31:13 compute-1 nova_compute[189066]: 2025-12-05 09:31:13.201 189070 DEBUG oslo_concurrency.lockutils [None req-1d07d364-88ff-45bc-b26a-7ddc3a3b818d bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:13 compute-1 podman[226299]: 2025-12-05 09:31:13.675199478 +0000 UTC m=+0.107631789 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.060 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.060 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.061 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.061 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.146 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.211 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.212 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.258 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.275 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.429 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.431 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5611MB free_disk=73.33284378051758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.431 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.432 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.537 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance a0adc838-396e-45a2-8503-a2ab451cd778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.537 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.538 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.582 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.598 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.618 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.619 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.637 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.822 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927059.8208625, d615ca65-b10a-4ff3-a274-7b300d2e1808 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.823 189070 INFO nova.compute.manager [-] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] VM Stopped (Lifecycle Event)
Dec 05 09:31:14 compute-1 nova_compute[189066]: 2025-12-05 09:31:14.850 189070 DEBUG nova.compute.manager [None req-07d619ca-759f-43b2-9813-354461663d87 - - - - - -] [instance: d615ca65-b10a-4ff3-a274-7b300d2e1808] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:15 compute-1 nova_compute[189066]: 2025-12-05 09:31:15.619 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:15 compute-1 podman[226332]: 2025-12-05 09:31:15.65419202 +0000 UTC m=+0.077158109 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 05 09:31:16 compute-1 nova_compute[189066]: 2025-12-05 09:31:16.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:16 compute-1 nova_compute[189066]: 2025-12-05 09:31:16.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:16 compute-1 nova_compute[189066]: 2025-12-05 09:31:16.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:31:17 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:17.022 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:31:17 compute-1 nova_compute[189066]: 2025-12-05 09:31:17.023 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:17 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:17.024 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:31:17 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:17.025 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:18 compute-1 nova_compute[189066]: 2025-12-05 09:31:18.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:18 compute-1 nova_compute[189066]: 2025-12-05 09:31:18.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:31:18 compute-1 nova_compute[189066]: 2025-12-05 09:31:18.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:31:18 compute-1 podman[226352]: 2025-12-05 09:31:18.644669381 +0000 UTC m=+0.078632316 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:31:18 compute-1 nova_compute[189066]: 2025-12-05 09:31:18.926 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:31:18 compute-1 nova_compute[189066]: 2025-12-05 09:31:18.926 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:31:18 compute-1 nova_compute[189066]: 2025-12-05 09:31:18.927 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:31:18 compute-1 nova_compute[189066]: 2025-12-05 09:31:18.927 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a0adc838-396e-45a2-8503-a2ab451cd778 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:31:18 compute-1 nova_compute[189066]: 2025-12-05 09:31:18.958 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:18 compute-1 NetworkManager[55704]: <info>  [1764927078.9587] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Dec 05 09:31:18 compute-1 NetworkManager[55704]: <info>  [1764927078.9593] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Dec 05 09:31:19 compute-1 nova_compute[189066]: 2025-12-05 09:31:19.122 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:19 compute-1 ovn_controller[95809]: 2025-12-05T09:31:19Z|00147|binding|INFO|Releasing lport dfe2c5ef-972a-4b59-b38d-e7bdc4437f37 from this chassis (sb_readonly=0)
Dec 05 09:31:19 compute-1 nova_compute[189066]: 2025-12-05 09:31:19.147 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:19 compute-1 nova_compute[189066]: 2025-12-05 09:31:19.261 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:19 compute-1 nova_compute[189066]: 2025-12-05 09:31:19.614 189070 DEBUG nova.compute.manager [req-6e00dc7d-9f57-4363-a49c-d72e611c444c req-65aff028-97c7-46d1-9529-00b992dc1a34 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-changed-c8f841b5-bac4-4aed-83b6-65ebccb9c49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:31:19 compute-1 nova_compute[189066]: 2025-12-05 09:31:19.615 189070 DEBUG nova.compute.manager [req-6e00dc7d-9f57-4363-a49c-d72e611c444c req-65aff028-97c7-46d1-9529-00b992dc1a34 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Refreshing instance network info cache due to event network-changed-c8f841b5-bac4-4aed-83b6-65ebccb9c49e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:31:19 compute-1 nova_compute[189066]: 2025-12-05 09:31:19.616 189070 DEBUG oslo_concurrency.lockutils [req-6e00dc7d-9f57-4363-a49c-d72e611c444c req-65aff028-97c7-46d1-9529-00b992dc1a34 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:31:19 compute-1 nova_compute[189066]: 2025-12-05 09:31:19.641 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:22 compute-1 podman[226373]: 2025-12-05 09:31:22.638044176 +0000 UTC m=+0.065863941 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=openstack_network_exporter)
Dec 05 09:31:23 compute-1 nova_compute[189066]: 2025-12-05 09:31:23.286 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updating instance_info_cache with network_info: [{"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:31:23 compute-1 nova_compute[189066]: 2025-12-05 09:31:23.319 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:31:23 compute-1 nova_compute[189066]: 2025-12-05 09:31:23.320 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:31:23 compute-1 nova_compute[189066]: 2025-12-05 09:31:23.320 189070 DEBUG oslo_concurrency.lockutils [req-6e00dc7d-9f57-4363-a49c-d72e611c444c req-65aff028-97c7-46d1-9529-00b992dc1a34 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:31:23 compute-1 nova_compute[189066]: 2025-12-05 09:31:23.320 189070 DEBUG nova.network.neutron [req-6e00dc7d-9f57-4363-a49c-d72e611c444c req-65aff028-97c7-46d1-9529-00b992dc1a34 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Refreshing network info cache for port c8f841b5-bac4-4aed-83b6-65ebccb9c49e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:31:23 compute-1 nova_compute[189066]: 2025-12-05 09:31:23.322 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:31:23 compute-1 nova_compute[189066]: 2025-12-05 09:31:23.987 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:24 compute-1 nova_compute[189066]: 2025-12-05 09:31:24.305 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:24 compute-1 nova_compute[189066]: 2025-12-05 09:31:24.643 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:25 compute-1 podman[226407]: 2025-12-05 09:31:25.632867594 +0000 UTC m=+0.067057602 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:31:26 compute-1 nova_compute[189066]: 2025-12-05 09:31:26.212 189070 DEBUG nova.network.neutron [req-6e00dc7d-9f57-4363-a49c-d72e611c444c req-65aff028-97c7-46d1-9529-00b992dc1a34 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updated VIF entry in instance network info cache for port c8f841b5-bac4-4aed-83b6-65ebccb9c49e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:31:26 compute-1 nova_compute[189066]: 2025-12-05 09:31:26.213 189070 DEBUG nova.network.neutron [req-6e00dc7d-9f57-4363-a49c-d72e611c444c req-65aff028-97c7-46d1-9529-00b992dc1a34 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updating instance_info_cache with network_info: [{"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:31:26 compute-1 nova_compute[189066]: 2025-12-05 09:31:26.238 189070 DEBUG oslo_concurrency.lockutils [req-6e00dc7d-9f57-4363-a49c-d72e611c444c req-65aff028-97c7-46d1-9529-00b992dc1a34 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:31:26 compute-1 ovn_controller[95809]: 2025-12-05T09:31:26Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a7:2b:28 10.100.0.5
Dec 05 09:31:26 compute-1 ovn_controller[95809]: 2025-12-05T09:31:26Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a7:2b:28 10.100.0.5
Dec 05 09:31:29 compute-1 nova_compute[189066]: 2025-12-05 09:31:29.103 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:29 compute-1 nova_compute[189066]: 2025-12-05 09:31:29.307 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:29 compute-1 nova_compute[189066]: 2025-12-05 09:31:29.645 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:32 compute-1 podman[226432]: 2025-12-05 09:31:32.620941354 +0000 UTC m=+0.064739524 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:31:34 compute-1 nova_compute[189066]: 2025-12-05 09:31:34.309 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:34 compute-1 nova_compute[189066]: 2025-12-05 09:31:34.648 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.312 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.313 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.314 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.350 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.545 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.546 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.554 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.555 189070 INFO nova.compute.claims [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:31:39 compute-1 podman[226455]: 2025-12-05 09:31:39.631548597 +0000 UTC m=+0.069224463 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute)
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.650 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.728 189070 DEBUG nova.compute.provider_tree [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.748 189070 DEBUG nova.scheduler.client.report [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.778 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.779 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.858 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.858 189070 DEBUG nova.network.neutron [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.885 189070 INFO nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:31:39 compute-1 nova_compute[189066]: 2025-12-05 09:31:39.908 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.012 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.013 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.013 189070 INFO nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Creating image(s)
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.014 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.014 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.015 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.028 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.106 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.108 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.108 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.119 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.185 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.186 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.231 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.232 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.233 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.295 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.296 189070 DEBUG nova.virt.disk.api [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Checking if we can resize image /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.297 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.363 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.365 189070 DEBUG nova.virt.disk.api [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Cannot resize image /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.365 189070 DEBUG nova.objects.instance [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'migration_context' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.391 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.391 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Ensure instance console log exists: /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.392 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.392 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.392 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:40 compute-1 nova_compute[189066]: 2025-12-05 09:31:40.681 189070 DEBUG nova.policy [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:31:43 compute-1 nova_compute[189066]: 2025-12-05 09:31:43.220 189070 DEBUG nova.network.neutron [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Successfully created port: d7d3d638-32a8-4d3e-abce-d0e8942a6c22 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:31:44 compute-1 nova_compute[189066]: 2025-12-05 09:31:44.314 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:44 compute-1 nova_compute[189066]: 2025-12-05 09:31:44.652 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:44 compute-1 podman[226489]: 2025-12-05 09:31:44.662701677 +0000 UTC m=+0.102663717 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 05 09:31:45 compute-1 nova_compute[189066]: 2025-12-05 09:31:45.486 189070 DEBUG nova.network.neutron [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Successfully updated port: d7d3d638-32a8-4d3e-abce-d0e8942a6c22 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:31:45 compute-1 nova_compute[189066]: 2025-12-05 09:31:45.655 189070 DEBUG nova.compute.manager [req-8ad80b61-4922-4346-a7f2-e57b77d86bdc req-c3675a4b-4fc4-47a4-8a3c-cc0ffbea19ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-changed-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:31:45 compute-1 nova_compute[189066]: 2025-12-05 09:31:45.656 189070 DEBUG nova.compute.manager [req-8ad80b61-4922-4346-a7f2-e57b77d86bdc req-c3675a4b-4fc4-47a4-8a3c-cc0ffbea19ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Refreshing instance network info cache due to event network-changed-d7d3d638-32a8-4d3e-abce-d0e8942a6c22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:31:45 compute-1 nova_compute[189066]: 2025-12-05 09:31:45.656 189070 DEBUG oslo_concurrency.lockutils [req-8ad80b61-4922-4346-a7f2-e57b77d86bdc req-c3675a4b-4fc4-47a4-8a3c-cc0ffbea19ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:31:45 compute-1 nova_compute[189066]: 2025-12-05 09:31:45.656 189070 DEBUG oslo_concurrency.lockutils [req-8ad80b61-4922-4346-a7f2-e57b77d86bdc req-c3675a4b-4fc4-47a4-8a3c-cc0ffbea19ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:31:45 compute-1 nova_compute[189066]: 2025-12-05 09:31:45.656 189070 DEBUG nova.network.neutron [req-8ad80b61-4922-4346-a7f2-e57b77d86bdc req-c3675a4b-4fc4-47a4-8a3c-cc0ffbea19ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Refreshing network info cache for port d7d3d638-32a8-4d3e-abce-d0e8942a6c22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:31:45 compute-1 nova_compute[189066]: 2025-12-05 09:31:45.671 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:31:45 compute-1 nova_compute[189066]: 2025-12-05 09:31:45.896 189070 DEBUG nova.network.neutron [req-8ad80b61-4922-4346-a7f2-e57b77d86bdc req-c3675a4b-4fc4-47a4-8a3c-cc0ffbea19ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:31:46 compute-1 podman[226515]: 2025-12-05 09:31:46.6185618 +0000 UTC m=+0.057589938 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 09:31:46 compute-1 nova_compute[189066]: 2025-12-05 09:31:46.981 189070 DEBUG nova.network.neutron [req-8ad80b61-4922-4346-a7f2-e57b77d86bdc req-c3675a4b-4fc4-47a4-8a3c-cc0ffbea19ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:31:47 compute-1 nova_compute[189066]: 2025-12-05 09:31:47.022 189070 DEBUG oslo_concurrency.lockutils [req-8ad80b61-4922-4346-a7f2-e57b77d86bdc req-c3675a4b-4fc4-47a4-8a3c-cc0ffbea19ee 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:31:47 compute-1 nova_compute[189066]: 2025-12-05 09:31:47.022 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:31:47 compute-1 nova_compute[189066]: 2025-12-05 09:31:47.023 189070 DEBUG nova.network.neutron [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:31:47 compute-1 nova_compute[189066]: 2025-12-05 09:31:47.906 189070 DEBUG nova.network.neutron [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:31:49 compute-1 nova_compute[189066]: 2025-12-05 09:31:49.317 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:49 compute-1 podman[226534]: 2025-12-05 09:31:49.618106285 +0000 UTC m=+0.055182109 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:31:49 compute-1 nova_compute[189066]: 2025-12-05 09:31:49.654 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.055 189070 DEBUG nova.network.neutron [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.113 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.114 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Instance network_info: |[{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.116 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Start _get_guest_xml network_info=[{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.123 189070 WARNING nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.132 189070 DEBUG nova.virt.libvirt.host [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.132 189070 DEBUG nova.virt.libvirt.host [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.140 189070 DEBUG nova.virt.libvirt.host [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.140 189070 DEBUG nova.virt.libvirt.host [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.142 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.142 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.143 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.143 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.143 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.144 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.144 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.144 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.144 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.145 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.145 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.145 189070 DEBUG nova.virt.hardware [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.149 189070 DEBUG nova.virt.libvirt.vif [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:31:39Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.150 189070 DEBUG nova.network.os_vif_util [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.150 189070 DEBUG nova.network.os_vif_util [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:68:29,bridge_name='br-int',has_traffic_filtering=True,id=d7d3d638-32a8-4d3e-abce-d0e8942a6c22,network=Network(7e493efd-28a8-4124-9b83-acc24853936f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d3d638-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.152 189070 DEBUG nova.objects.instance [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.267 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <uuid>b843e130-e156-47b6-8a2a-d4811973b93a</uuid>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <name>instance-00000017</name>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkBasicOps-server-1503446723</nova:name>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:31:50</nova:creationTime>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:31:50 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:31:50 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:31:50 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:31:50 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:31:50 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:31:50 compute-1 nova_compute[189066]:         <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:31:50 compute-1 nova_compute[189066]:         <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:31:50 compute-1 nova_compute[189066]:         <nova:port uuid="d7d3d638-32a8-4d3e-abce-d0e8942a6c22">
Dec 05 09:31:50 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <system>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <entry name="serial">b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <entry name="uuid">b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </system>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <os>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   </os>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <features>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   </features>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.config"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:7f:68:29"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <target dev="tapd7d3d638-32"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log" append="off"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <video>
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </video>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:31:50 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:31:50 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:31:50 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:31:50 compute-1 nova_compute[189066]: </domain>
Dec 05 09:31:50 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.269 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Preparing to wait for external event network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.270 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.270 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.271 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.272 189070 DEBUG nova.virt.libvirt.vif [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:31:39Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.272 189070 DEBUG nova.network.os_vif_util [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.273 189070 DEBUG nova.network.os_vif_util [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:68:29,bridge_name='br-int',has_traffic_filtering=True,id=d7d3d638-32a8-4d3e-abce-d0e8942a6c22,network=Network(7e493efd-28a8-4124-9b83-acc24853936f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d3d638-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.273 189070 DEBUG os_vif [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:68:29,bridge_name='br-int',has_traffic_filtering=True,id=d7d3d638-32a8-4d3e-abce-d0e8942a6c22,network=Network(7e493efd-28a8-4124-9b83-acc24853936f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d3d638-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.274 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.275 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.275 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.281 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.281 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d3d638-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.282 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7d3d638-32, col_values=(('external_ids', {'iface-id': 'd7d3d638-32a8-4d3e-abce-d0e8942a6c22', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:68:29', 'vm-uuid': 'b843e130-e156-47b6-8a2a-d4811973b93a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.284 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:50 compute-1 NetworkManager[55704]: <info>  [1764927110.2858] manager: (tapd7d3d638-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.288 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.294 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.296 189070 INFO os_vif [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:68:29,bridge_name='br-int',has_traffic_filtering=True,id=d7d3d638-32a8-4d3e-abce-d0e8942a6c22,network=Network(7e493efd-28a8-4124-9b83-acc24853936f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d3d638-32')
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.494 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.495 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.495 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:7f:68:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.496 189070 INFO nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Using config drive
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.982 189070 INFO nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Creating config drive at /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.config
Dec 05 09:31:50 compute-1 nova_compute[189066]: 2025-12-05 09:31:50.988 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3f7832vy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.132 189070 DEBUG oslo_concurrency.processutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3f7832vy" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:31:51 compute-1 kernel: tapd7d3d638-32: entered promiscuous mode
Dec 05 09:31:51 compute-1 NetworkManager[55704]: <info>  [1764927111.1962] manager: (tapd7d3d638-32): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.198 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:51 compute-1 ovn_controller[95809]: 2025-12-05T09:31:51Z|00148|binding|INFO|Claiming lport d7d3d638-32a8-4d3e-abce-d0e8942a6c22 for this chassis.
Dec 05 09:31:51 compute-1 ovn_controller[95809]: 2025-12-05T09:31:51Z|00149|binding|INFO|d7d3d638-32a8-4d3e-abce-d0e8942a6c22: Claiming fa:16:3e:7f:68:29 10.100.0.10
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.201 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:51 compute-1 ovn_controller[95809]: 2025-12-05T09:31:51Z|00150|binding|INFO|Setting lport d7d3d638-32a8-4d3e-abce-d0e8942a6c22 ovn-installed in OVS
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.218 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:51 compute-1 systemd-machined[154815]: New machine qemu-12-instance-00000017.
Dec 05 09:31:51 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-00000017.
Dec 05 09:31:51 compute-1 systemd-udevd[226573]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:31:51 compute-1 NetworkManager[55704]: <info>  [1764927111.2743] device (tapd7d3d638-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:31:51 compute-1 NetworkManager[55704]: <info>  [1764927111.2751] device (tapd7d3d638-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.663 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927111.662628, b843e130-e156-47b6-8a2a-d4811973b93a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.665 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] VM Started (Lifecycle Event)
Dec 05 09:31:51 compute-1 ovn_controller[95809]: 2025-12-05T09:31:51Z|00151|binding|INFO|Setting lport d7d3d638-32a8-4d3e-abce-d0e8942a6c22 up in Southbound
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.690 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:68:29 10.100.0.10'], port_security=['fa:16:3e:7f:68:29 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b843e130-e156-47b6-8a2a-d4811973b93a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e493efd-28a8-4124-9b83-acc24853936f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2a619c6b-41fa-43e5-b2d4-5d1e0fb94c32', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1e94af5e-95ab-48c4-b791-d5e364f548eb, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=d7d3d638-32a8-4d3e-abce-d0e8942a6c22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.692 105272 INFO neutron.agent.ovn.metadata.agent [-] Port d7d3d638-32a8-4d3e-abce-d0e8942a6c22 in datapath 7e493efd-28a8-4124-9b83-acc24853936f bound to our chassis
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.696 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7e493efd-28a8-4124-9b83-acc24853936f
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.710 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.712 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[09e6498b-352f-49a3-bea8-8e575e634b28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.714 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7e493efd-21 in ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.717 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927111.6627307, b843e130-e156-47b6-8a2a-d4811973b93a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.717 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] VM Paused (Lifecycle Event)
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.717 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7e493efd-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.718 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0d2b67ce-9a3b-42d7-afac-6d05078c9a70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.719 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4fbfb9a0-7935-47e5-a9ab-aa50aa34cc47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.735 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[8380bf36-5ded-4ba9-8295-bfd7983e4c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.741 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.746 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.768 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8d83fe24-c3cc-4c11-a46f-c9b0f6aa1493]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 nova_compute[189066]: 2025-12-05 09:31:51.783 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.805 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[5a81f415-3457-496e-abd5-d237e66a0735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.813 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[294df50a-a49c-4b58-8f64-ce1736520494]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 NetworkManager[55704]: <info>  [1764927111.8151] manager: (tap7e493efd-20): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.851 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[d79c5772-95c5-483b-8398-3ddbcfe7fa51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.854 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[23b89093-0881-496f-af0f-eae721951412]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 NetworkManager[55704]: <info>  [1764927111.8791] device (tap7e493efd-20): carrier: link connected
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.884 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[6af8804d-ae23-479b-be01-b3d2ebda0567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.910 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[42aaa298-f727-4151-bc16-bc82a037d7cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e493efd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:0c:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438488, 'reachable_time': 28524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226613, 'error': None, 'target': 'ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.931 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd86654-5293-4efd-ad36-1e141f14e80d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:c1b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438488, 'tstamp': 438488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226614, 'error': None, 'target': 'ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.955 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d5163536-4356-4b0a-83ec-9d9fcdec6f4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e493efd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:0c:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438488, 'reachable_time': 28524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226615, 'error': None, 'target': 'ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:51 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:51.991 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c8e712-78b4-42a0-a2ce-3fd6a00ba06b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.085 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee411ee-42a7-4176-8106-63d44a3e3b8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.088 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e493efd-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.088 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.089 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e493efd-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:52 compute-1 nova_compute[189066]: 2025-12-05 09:31:52.091 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:52 compute-1 NetworkManager[55704]: <info>  [1764927112.0919] manager: (tap7e493efd-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Dec 05 09:31:52 compute-1 kernel: tap7e493efd-20: entered promiscuous mode
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.096 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7e493efd-20, col_values=(('external_ids', {'iface-id': '99e3597b-6e93-4623-ba40-68c3147cea55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:31:52 compute-1 ovn_controller[95809]: 2025-12-05T09:31:52Z|00152|binding|INFO|Releasing lport 99e3597b-6e93-4623-ba40-68c3147cea55 from this chassis (sb_readonly=0)
Dec 05 09:31:52 compute-1 nova_compute[189066]: 2025-12-05 09:31:52.097 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.100 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7e493efd-28a8-4124-9b83-acc24853936f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7e493efd-28a8-4124-9b83-acc24853936f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.102 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fff27b2a-11fc-4829-b6e5-6b50fce6c757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.103 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-7e493efd-28a8-4124-9b83-acc24853936f
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/7e493efd-28a8-4124-9b83-acc24853936f.pid.haproxy
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 7e493efd-28a8-4124-9b83-acc24853936f
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:31:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:31:52.104 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f', 'env', 'PROCESS_TAG=haproxy-7e493efd-28a8-4124-9b83-acc24853936f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7e493efd-28a8-4124-9b83-acc24853936f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:31:52 compute-1 nova_compute[189066]: 2025-12-05 09:31:52.110 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:52 compute-1 podman[226647]: 2025-12-05 09:31:52.571616835 +0000 UTC m=+0.058410699 container create e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 05 09:31:52 compute-1 systemd[1]: Started libpod-conmon-e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32.scope.
Dec 05 09:31:52 compute-1 podman[226647]: 2025-12-05 09:31:52.542019497 +0000 UTC m=+0.028813381 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:31:52 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:31:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/761f53ea117b0cb1315f5f911b085bd8d2ffeb6bb05961299e6af01dda09830a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:31:52 compute-1 podman[226647]: 2025-12-05 09:31:52.678392922 +0000 UTC m=+0.165186876 container init e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 09:31:52 compute-1 podman[226647]: 2025-12-05 09:31:52.688264475 +0000 UTC m=+0.175058349 container start e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 09:31:52 compute-1 neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f[226663]: [NOTICE]   (226674) : New worker (226682) forked
Dec 05 09:31:52 compute-1 neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f[226663]: [NOTICE]   (226674) : Loading success.
Dec 05 09:31:52 compute-1 podman[226666]: 2025-12-05 09:31:52.752553847 +0000 UTC m=+0.077467297 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git)
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.386 189070 DEBUG nova.compute.manager [req-49aaf21f-4f8a-4538-9c02-b7ce7dec72f1 req-6b7df616-65c9-4040-91f3-6cfae4933533 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.386 189070 DEBUG oslo_concurrency.lockutils [req-49aaf21f-4f8a-4538-9c02-b7ce7dec72f1 req-6b7df616-65c9-4040-91f3-6cfae4933533 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.387 189070 DEBUG oslo_concurrency.lockutils [req-49aaf21f-4f8a-4538-9c02-b7ce7dec72f1 req-6b7df616-65c9-4040-91f3-6cfae4933533 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.387 189070 DEBUG oslo_concurrency.lockutils [req-49aaf21f-4f8a-4538-9c02-b7ce7dec72f1 req-6b7df616-65c9-4040-91f3-6cfae4933533 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.387 189070 DEBUG nova.compute.manager [req-49aaf21f-4f8a-4538-9c02-b7ce7dec72f1 req-6b7df616-65c9-4040-91f3-6cfae4933533 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Processing event network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.388 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.394 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927113.3943043, b843e130-e156-47b6-8a2a-d4811973b93a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.395 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] VM Resumed (Lifecycle Event)
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.397 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.401 189070 INFO nova.virt.libvirt.driver [-] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Instance spawned successfully.
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.402 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.419 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.427 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.430 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.430 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.431 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.431 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.431 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.432 189070 DEBUG nova.virt.libvirt.driver [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.470 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.511 189070 INFO nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Took 13.50 seconds to spawn the instance on the hypervisor.
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.512 189070 DEBUG nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.678 189070 INFO nova.compute.manager [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Took 14.18 seconds to build instance.
Dec 05 09:31:53 compute-1 nova_compute[189066]: 2025-12-05 09:31:53.951 189070 DEBUG oslo_concurrency.lockutils [None req-ea5da156-5170-4219-8df4-dd2287031ffd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:54 compute-1 nova_compute[189066]: 2025-12-05 09:31:54.657 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:55 compute-1 nova_compute[189066]: 2025-12-05 09:31:55.285 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:31:55 compute-1 nova_compute[189066]: 2025-12-05 09:31:55.495 189070 DEBUG nova.compute.manager [req-de2043a8-6c0e-4e70-9b02-cb3c97868901 req-12033b02-1c9c-4ee2-992a-7c58728f1f60 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:31:55 compute-1 nova_compute[189066]: 2025-12-05 09:31:55.496 189070 DEBUG oslo_concurrency.lockutils [req-de2043a8-6c0e-4e70-9b02-cb3c97868901 req-12033b02-1c9c-4ee2-992a-7c58728f1f60 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:31:55 compute-1 nova_compute[189066]: 2025-12-05 09:31:55.496 189070 DEBUG oslo_concurrency.lockutils [req-de2043a8-6c0e-4e70-9b02-cb3c97868901 req-12033b02-1c9c-4ee2-992a-7c58728f1f60 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:31:55 compute-1 nova_compute[189066]: 2025-12-05 09:31:55.496 189070 DEBUG oslo_concurrency.lockutils [req-de2043a8-6c0e-4e70-9b02-cb3c97868901 req-12033b02-1c9c-4ee2-992a-7c58728f1f60 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:31:55 compute-1 nova_compute[189066]: 2025-12-05 09:31:55.497 189070 DEBUG nova.compute.manager [req-de2043a8-6c0e-4e70-9b02-cb3c97868901 req-12033b02-1c9c-4ee2-992a-7c58728f1f60 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] No waiting events found dispatching network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:31:55 compute-1 nova_compute[189066]: 2025-12-05 09:31:55.497 189070 WARNING nova.compute.manager [req-de2043a8-6c0e-4e70-9b02-cb3c97868901 req-12033b02-1c9c-4ee2-992a-7c58728f1f60 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received unexpected event network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 for instance with vm_state active and task_state None.
Dec 05 09:31:56 compute-1 podman[226700]: 2025-12-05 09:31:56.628457982 +0000 UTC m=+0.060169841 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:31:59 compute-1 nova_compute[189066]: 2025-12-05 09:31:59.660 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:00 compute-1 nova_compute[189066]: 2025-12-05 09:32:00.287 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:01 compute-1 anacron[77209]: Job `cron.weekly' started
Dec 05 09:32:01 compute-1 anacron[77209]: Job `cron.weekly' terminated
Dec 05 09:32:03 compute-1 nova_compute[189066]: 2025-12-05 09:32:03.538 189070 DEBUG nova.compute.manager [req-a4048530-be57-4523-bd8e-55af4c0643ed req-a7a0c429-fe10-422f-b4f5-157549b994f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-changed-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:03 compute-1 nova_compute[189066]: 2025-12-05 09:32:03.539 189070 DEBUG nova.compute.manager [req-a4048530-be57-4523-bd8e-55af4c0643ed req-a7a0c429-fe10-422f-b4f5-157549b994f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Refreshing instance network info cache due to event network-changed-d7d3d638-32a8-4d3e-abce-d0e8942a6c22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:32:03 compute-1 nova_compute[189066]: 2025-12-05 09:32:03.539 189070 DEBUG oslo_concurrency.lockutils [req-a4048530-be57-4523-bd8e-55af4c0643ed req-a7a0c429-fe10-422f-b4f5-157549b994f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:32:03 compute-1 nova_compute[189066]: 2025-12-05 09:32:03.539 189070 DEBUG oslo_concurrency.lockutils [req-a4048530-be57-4523-bd8e-55af4c0643ed req-a7a0c429-fe10-422f-b4f5-157549b994f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:32:03 compute-1 nova_compute[189066]: 2025-12-05 09:32:03.540 189070 DEBUG nova.network.neutron [req-a4048530-be57-4523-bd8e-55af4c0643ed req-a7a0c429-fe10-422f-b4f5-157549b994f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Refreshing network info cache for port d7d3d638-32a8-4d3e-abce-d0e8942a6c22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:32:03 compute-1 podman[226726]: 2025-12-05 09:32:03.644418498 +0000 UTC m=+0.064141008 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:32:04 compute-1 ovn_controller[95809]: 2025-12-05T09:32:04Z|00153|binding|INFO|Releasing lport 99e3597b-6e93-4623-ba40-68c3147cea55 from this chassis (sb_readonly=0)
Dec 05 09:32:04 compute-1 ovn_controller[95809]: 2025-12-05T09:32:04Z|00154|binding|INFO|Releasing lport dfe2c5ef-972a-4b59-b38d-e7bdc4437f37 from this chassis (sb_readonly=0)
Dec 05 09:32:04 compute-1 nova_compute[189066]: 2025-12-05 09:32:04.283 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:04 compute-1 nova_compute[189066]: 2025-12-05 09:32:04.663 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:05 compute-1 nova_compute[189066]: 2025-12-05 09:32:05.290 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:05 compute-1 nova_compute[189066]: 2025-12-05 09:32:05.875 189070 DEBUG nova.network.neutron [req-a4048530-be57-4523-bd8e-55af4c0643ed req-a7a0c429-fe10-422f-b4f5-157549b994f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updated VIF entry in instance network info cache for port d7d3d638-32a8-4d3e-abce-d0e8942a6c22. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:32:05 compute-1 nova_compute[189066]: 2025-12-05 09:32:05.877 189070 DEBUG nova.network.neutron [req-a4048530-be57-4523-bd8e-55af4c0643ed req-a7a0c429-fe10-422f-b4f5-157549b994f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:05 compute-1 nova_compute[189066]: 2025-12-05 09:32:05.911 189070 DEBUG oslo_concurrency.lockutils [req-a4048530-be57-4523-bd8e-55af4c0643ed req-a7a0c429-fe10-422f-b4f5-157549b994f8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:32:07 compute-1 ovn_controller[95809]: 2025-12-05T09:32:07Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:68:29 10.100.0.10
Dec 05 09:32:07 compute-1 ovn_controller[95809]: 2025-12-05T09:32:07Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:68:29 10.100.0.10
Dec 05 09:32:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:08.877 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:08.878 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:08.879 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:09 compute-1 nova_compute[189066]: 2025-12-05 09:32:09.666 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:10 compute-1 nova_compute[189066]: 2025-12-05 09:32:10.295 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:10 compute-1 podman[226761]: 2025-12-05 09:32:10.649725143 +0000 UTC m=+0.077348454 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Dec 05 09:32:11 compute-1 nova_compute[189066]: 2025-12-05 09:32:11.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:11 compute-1 nova_compute[189066]: 2025-12-05 09:32:11.130 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:13 compute-1 nova_compute[189066]: 2025-12-05 09:32:13.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:13 compute-1 nova_compute[189066]: 2025-12-05 09:32:13.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:13 compute-1 nova_compute[189066]: 2025-12-05 09:32:13.173 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:14 compute-1 nova_compute[189066]: 2025-12-05 09:32:14.459 189070 INFO nova.compute.manager [None req-fb8526bb-1603-4b5c-bce9-3f52fbe80083 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Get console output
Dec 05 09:32:14 compute-1 nova_compute[189066]: 2025-12-05 09:32:14.471 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:32:14 compute-1 nova_compute[189066]: 2025-12-05 09:32:14.669 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.050 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.051 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.051 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.051 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.148 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:32:15 compute-1 podman[226791]: 2025-12-05 09:32:15.214823528 +0000 UTC m=+0.111584536 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.237 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.238 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.302 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.330 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.349 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.392 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.393 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.453 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.626 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.629 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5431MB free_disk=73.2756118774414GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.629 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.629 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.791 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance a0adc838-396e-45a2-8503-a2ab451cd778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.792 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance b843e130-e156-47b6-8a2a-d4811973b93a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.792 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.792 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.866 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing inventories for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.890 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating ProviderTree inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.891 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.917 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing aggregate associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 09:32:15 compute-1 nova_compute[189066]: 2025-12-05 09:32:15.945 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing trait associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 09:32:16 compute-1 nova_compute[189066]: 2025-12-05 09:32:16.010 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:32:16 compute-1 nova_compute[189066]: 2025-12-05 09:32:16.033 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:32:16 compute-1 nova_compute[189066]: 2025-12-05 09:32:16.064 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:32:16 compute-1 nova_compute[189066]: 2025-12-05 09:32:16.064 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:16 compute-1 nova_compute[189066]: 2025-12-05 09:32:16.065 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:16 compute-1 nova_compute[189066]: 2025-12-05 09:32:16.065 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 09:32:16 compute-1 nova_compute[189066]: 2025-12-05 09:32:16.078 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 09:32:17 compute-1 nova_compute[189066]: 2025-12-05 09:32:17.079 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:17 compute-1 nova_compute[189066]: 2025-12-05 09:32:17.080 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:17 compute-1 nova_compute[189066]: 2025-12-05 09:32:17.081 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:17 compute-1 nova_compute[189066]: 2025-12-05 09:32:17.081 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:32:17 compute-1 podman[226830]: 2025-12-05 09:32:17.640286235 +0000 UTC m=+0.060655343 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 09:32:17 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:17.790 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:32:17 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:17.791 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:32:17 compute-1 nova_compute[189066]: 2025-12-05 09:32:17.791 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.866 189070 DEBUG nova.compute.manager [req-1e1161cf-ee58-4f38-a061-22eeec23c706 req-669e1263-5aac-44fa-b06b-e1ab0f830cf8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-changed-c8f841b5-bac4-4aed-83b6-65ebccb9c49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.867 189070 DEBUG nova.compute.manager [req-1e1161cf-ee58-4f38-a061-22eeec23c706 req-669e1263-5aac-44fa-b06b-e1ab0f830cf8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Refreshing instance network info cache due to event network-changed-c8f841b5-bac4-4aed-83b6-65ebccb9c49e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.867 189070 DEBUG oslo_concurrency.lockutils [req-1e1161cf-ee58-4f38-a061-22eeec23c706 req-669e1263-5aac-44fa-b06b-e1ab0f830cf8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.867 189070 DEBUG oslo_concurrency.lockutils [req-1e1161cf-ee58-4f38-a061-22eeec23c706 req-669e1263-5aac-44fa-b06b-e1ab0f830cf8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.867 189070 DEBUG nova.network.neutron [req-1e1161cf-ee58-4f38-a061-22eeec23c706 req-669e1263-5aac-44fa-b06b-e1ab0f830cf8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Refreshing network info cache for port c8f841b5-bac4-4aed-83b6-65ebccb9c49e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.923 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "a0adc838-396e-45a2-8503-a2ab451cd778" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.923 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.923 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.924 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.924 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.926 189070 INFO nova.compute.manager [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Terminating instance
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.927 189070 DEBUG nova.compute.manager [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:32:18 compute-1 kernel: tapc8f841b5-ba (unregistering): left promiscuous mode
Dec 05 09:32:18 compute-1 NetworkManager[55704]: <info>  [1764927138.9572] device (tapc8f841b5-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.963 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:18 compute-1 ovn_controller[95809]: 2025-12-05T09:32:18Z|00155|binding|INFO|Releasing lport c8f841b5-bac4-4aed-83b6-65ebccb9c49e from this chassis (sb_readonly=0)
Dec 05 09:32:18 compute-1 ovn_controller[95809]: 2025-12-05T09:32:18Z|00156|binding|INFO|Setting lport c8f841b5-bac4-4aed-83b6-65ebccb9c49e down in Southbound
Dec 05 09:32:18 compute-1 ovn_controller[95809]: 2025-12-05T09:32:18Z|00157|binding|INFO|Removing iface tapc8f841b5-ba ovn-installed in OVS
Dec 05 09:32:18 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:18.971 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:2b:28 10.100.0.5'], port_security=['fa:16:3e:a7:2b:28 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a0adc838-396e-45a2-8503-a2ab451cd778', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbc58ba4-8158-4179-bf5b-c94cb9a82196', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c5bb818cba543bbb1bcff8df31dd9cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '923cdeef-e7c2-4413-bfb2-fbe572ab445e b6dc48f6-1863-466c-8b58-dcc336cd88a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eec9cf42-f628-43fa-9672-aa1cd3542052, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=c8f841b5-bac4-4aed-83b6-65ebccb9c49e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:32:18 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:18.972 105272 INFO neutron.agent.ovn.metadata.agent [-] Port c8f841b5-bac4-4aed-83b6-65ebccb9c49e in datapath dbc58ba4-8158-4179-bf5b-c94cb9a82196 unbound from our chassis
Dec 05 09:32:18 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:18.974 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dbc58ba4-8158-4179-bf5b-c94cb9a82196, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:32:18 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:18.977 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[75cde987-45d1-4b1f-95b6-7304e2c4ccb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:18 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:18.978 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196 namespace which is not needed anymore
Dec 05 09:32:18 compute-1 nova_compute[189066]: 2025-12-05 09:32:18.981 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:19 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec 05 09:32:19 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Consumed 16.185s CPU time.
Dec 05 09:32:19 compute-1 systemd-machined[154815]: Machine qemu-11-instance-00000016 terminated.
Dec 05 09:32:19 compute-1 neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196[226284]: [NOTICE]   (226288) : haproxy version is 2.8.14-c23fe91
Dec 05 09:32:19 compute-1 neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196[226284]: [NOTICE]   (226288) : path to executable is /usr/sbin/haproxy
Dec 05 09:32:19 compute-1 neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196[226284]: [WARNING]  (226288) : Exiting Master process...
Dec 05 09:32:19 compute-1 neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196[226284]: [WARNING]  (226288) : Exiting Master process...
Dec 05 09:32:19 compute-1 neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196[226284]: [ALERT]    (226288) : Current worker (226290) exited with code 143 (Terminated)
Dec 05 09:32:19 compute-1 neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196[226284]: [WARNING]  (226288) : All workers exited. Exiting... (0)
Dec 05 09:32:19 compute-1 systemd[1]: libpod-2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db.scope: Deactivated successfully.
Dec 05 09:32:19 compute-1 podman[226873]: 2025-12-05 09:32:19.153082538 +0000 UTC m=+0.053157109 container died 2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 05 09:32:19 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db-userdata-shm.mount: Deactivated successfully.
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.190 189070 DEBUG oslo_concurrency.lockutils [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "interface-b843e130-e156-47b6-8a2a-d4811973b93a-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.190 189070 DEBUG oslo_concurrency.lockutils [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "interface-b843e130-e156-47b6-8a2a-d4811973b93a-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.191 189070 DEBUG nova.objects.instance [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'flavor' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:32:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-c8a08c4d73e653d495ce7a5964b3e3119615b7065cd702612681aed36b236bf6-merged.mount: Deactivated successfully.
Dec 05 09:32:19 compute-1 podman[226873]: 2025-12-05 09:32:19.203580981 +0000 UTC m=+0.103655542 container cleanup 2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.207 189070 INFO nova.virt.libvirt.driver [-] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Instance destroyed successfully.
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.208 189070 DEBUG nova.objects.instance [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lazy-loading 'resources' on Instance uuid a0adc838-396e-45a2-8503-a2ab451cd778 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:32:19 compute-1 systemd[1]: libpod-conmon-2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db.scope: Deactivated successfully.
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.249 189070 DEBUG nova.virt.libvirt.vif [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:31:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1223075532-access_point-49256169',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1223075532-ac',id=22,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD+5Pg7shJpMimDThmBqstpAx0KkRt2i5yGiHWlpBuzwDEqniUE3UrKuB7/FlvnvA8AQmg1XL8HIIV8GqoiYNn1cRv3JE7sesclhpKZ5mzYbeAsim3ia0Zws6nHf4Otwlg==',key_name='tempest-TestSecurityGroupsBasicOps-2125780827',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:31:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6c5bb818cba543bbb1bcff8df31dd9cd',ramdisk_id='',reservation_id='r-1l334wzq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1223075532',owner_user_name='tempest-TestSecurityGroupsBasicOps-1223075532-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:31:13Z,user_data=None,user_id='bcc37d16c39547bba794fb1f43e889c1',uuid=a0adc838-396e-45a2-8503-a2ab451cd778,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.250 189070 DEBUG nova.network.os_vif_util [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converting VIF {"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.252 189070 DEBUG nova.network.os_vif_util [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a7:2b:28,bridge_name='br-int',has_traffic_filtering=True,id=c8f841b5-bac4-4aed-83b6-65ebccb9c49e,network=Network(dbc58ba4-8158-4179-bf5b-c94cb9a82196),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f841b5-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.252 189070 DEBUG os_vif [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:2b:28,bridge_name='br-int',has_traffic_filtering=True,id=c8f841b5-bac4-4aed-83b6-65ebccb9c49e,network=Network(dbc58ba4-8158-4179-bf5b-c94cb9a82196),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f841b5-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.260 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.261 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8f841b5-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.265 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.268 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.274 189070 INFO os_vif [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:2b:28,bridge_name='br-int',has_traffic_filtering=True,id=c8f841b5-bac4-4aed-83b6-65ebccb9c49e,network=Network(dbc58ba4-8158-4179-bf5b-c94cb9a82196),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f841b5-ba')
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.275 189070 INFO nova.virt.libvirt.driver [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Deleting instance files /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778_del
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.275 189070 INFO nova.virt.libvirt.driver [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Deletion of /var/lib/nova/instances/a0adc838-396e-45a2-8503-a2ab451cd778_del complete
Dec 05 09:32:19 compute-1 podman[226918]: 2025-12-05 09:32:19.282752088 +0000 UTC m=+0.051807095 container remove 2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.289 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a59fef8f-155f-4fa4-97b5-139f750debba]: (4, ('Fri Dec  5 09:32:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196 (2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db)\n2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db\nFri Dec  5 09:32:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196 (2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db)\n2c2206b261b3bab6122ba5007d2777e2483f9fa703006f4bc735f638caff06db\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.292 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[553cd0fc-b2d7-4d1e-b59e-22fa523335b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.293 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbc58ba4-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.295 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:19 compute-1 kernel: tapdbc58ba4-80: left promiscuous mode
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.307 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.311 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8d545fb1-3f84-4a81-87bb-26059b245222]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.333 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e98e73b9-e1f5-4ae9-b045-6cffd6077e3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.335 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b390d9fd-c168-46e7-9c03-0d2730ed5c29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.356 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d55ccc-ca6e-4cb3-a4c9-7a7482df8e94]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434329, 'reachable_time': 35802, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226932, 'error': None, 'target': 'ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:19 compute-1 systemd[1]: run-netns-ovnmeta\x2ddbc58ba4\x2d8158\x2d4179\x2dbf5b\x2dc94cb9a82196.mount: Deactivated successfully.
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.362 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dbc58ba4-8158-4179-bf5b-c94cb9a82196 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:32:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:19.362 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[59eb4769-333f-4b13-a403-e2bfea8518a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.378 189070 INFO nova.compute.manager [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Took 0.45 seconds to destroy the instance on the hypervisor.
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.379 189070 DEBUG oslo.service.loopingcall [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.379 189070 DEBUG nova.compute.manager [-] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.380 189070 DEBUG nova.network.neutron [-] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:32:19 compute-1 nova_compute[189066]: 2025-12-05 09:32:19.673 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.038 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.039 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.040 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.070 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.185 189070 DEBUG nova.compute.manager [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-vif-unplugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.185 189070 DEBUG oslo_concurrency.lockutils [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.186 189070 DEBUG oslo_concurrency.lockutils [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.186 189070 DEBUG oslo_concurrency.lockutils [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.186 189070 DEBUG nova.compute.manager [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] No waiting events found dispatching network-vif-unplugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.186 189070 DEBUG nova.compute.manager [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-vif-unplugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.187 189070 DEBUG nova.compute.manager [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.187 189070 DEBUG oslo_concurrency.lockutils [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.187 189070 DEBUG oslo_concurrency.lockutils [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.187 189070 DEBUG oslo_concurrency.lockutils [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.187 189070 DEBUG nova.compute.manager [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] No waiting events found dispatching network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.188 189070 WARNING nova.compute.manager [req-83539462-66fd-4a57-81e2-5514a27b6987 req-7e19b27b-841f-4bd4-9ccd-08102302d15f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received unexpected event network-vif-plugged-c8f841b5-bac4-4aed-83b6-65ebccb9c49e for instance with vm_state active and task_state deleting.
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.260 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.260 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.261 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.261 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.273 189070 DEBUG nova.objects.instance [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_requests' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.284 189070 DEBUG nova.network.neutron [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.414 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:20 compute-1 podman[226933]: 2025-12-05 09:32:20.634512869 +0000 UTC m=+0.067789288 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.637 189070 DEBUG nova.policy [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.667 189070 DEBUG nova.network.neutron [-] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.730 189070 INFO nova.compute.manager [-] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Took 1.35 seconds to deallocate network for instance.
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.817 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.818 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.908 189070 DEBUG nova.compute.provider_tree [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.925 189070 DEBUG nova.scheduler.client.report [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.951 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:20 compute-1 nova_compute[189066]: 2025-12-05 09:32:20.992 189070 INFO nova.scheduler.client.report [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Deleted allocations for instance a0adc838-396e-45a2-8503-a2ab451cd778
Dec 05 09:32:21 compute-1 nova_compute[189066]: 2025-12-05 09:32:21.013 189070 DEBUG nova.network.neutron [req-1e1161cf-ee58-4f38-a061-22eeec23c706 req-669e1263-5aac-44fa-b06b-e1ab0f830cf8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updated VIF entry in instance network info cache for port c8f841b5-bac4-4aed-83b6-65ebccb9c49e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:32:21 compute-1 nova_compute[189066]: 2025-12-05 09:32:21.014 189070 DEBUG nova.network.neutron [req-1e1161cf-ee58-4f38-a061-22eeec23c706 req-669e1263-5aac-44fa-b06b-e1ab0f830cf8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Updating instance_info_cache with network_info: [{"id": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "address": "fa:16:3e:a7:2b:28", "network": {"id": "dbc58ba4-8158-4179-bf5b-c94cb9a82196", "bridge": "br-int", "label": "tempest-network-smoke--906868098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c5bb818cba543bbb1bcff8df31dd9cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f841b5-ba", "ovs_interfaceid": "c8f841b5-bac4-4aed-83b6-65ebccb9c49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:21 compute-1 nova_compute[189066]: 2025-12-05 09:32:21.059 189070 DEBUG oslo_concurrency.lockutils [req-1e1161cf-ee58-4f38-a061-22eeec23c706 req-669e1263-5aac-44fa-b06b-e1ab0f830cf8 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-a0adc838-396e-45a2-8503-a2ab451cd778" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:32:21 compute-1 nova_compute[189066]: 2025-12-05 09:32:21.077 189070 DEBUG oslo_concurrency.lockutils [None req-d1d49803-3eee-42c9-bd91-eab22a9b203b bcc37d16c39547bba794fb1f43e889c1 6c5bb818cba543bbb1bcff8df31dd9cd - - default default] Lock "a0adc838-396e-45a2-8503-a2ab451cd778" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:21 compute-1 nova_compute[189066]: 2025-12-05 09:32:21.431 189070 DEBUG nova.network.neutron [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Successfully created port: 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:32:21 compute-1 nova_compute[189066]: 2025-12-05 09:32:21.719 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:21 compute-1 nova_compute[189066]: 2025-12-05 09:32:21.741 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:32:21 compute-1 nova_compute[189066]: 2025-12-05 09:32:21.742 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:32:22 compute-1 nova_compute[189066]: 2025-12-05 09:32:22.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:22 compute-1 nova_compute[189066]: 2025-12-05 09:32:22.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 09:32:22 compute-1 nova_compute[189066]: 2025-12-05 09:32:22.455 189070 DEBUG nova.network.neutron [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Successfully updated port: 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:32:22 compute-1 nova_compute[189066]: 2025-12-05 09:32:22.586 189070 DEBUG oslo_concurrency.lockutils [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:32:22 compute-1 nova_compute[189066]: 2025-12-05 09:32:22.586 189070 DEBUG oslo_concurrency.lockutils [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:32:22 compute-1 nova_compute[189066]: 2025-12-05 09:32:22.587 189070 DEBUG nova.network.neutron [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:32:23 compute-1 podman[226953]: 2025-12-05 09:32:23.631849029 +0000 UTC m=+0.060100900 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, name=ubi9-minimal)
Dec 05 09:32:23 compute-1 nova_compute[189066]: 2025-12-05 09:32:23.636 189070 DEBUG nova.compute.manager [req-c97b8df9-c730-4eaa-9efd-21f406422007 req-7f0b53c4-20e9-4b0e-8e4c-02850f0f276e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Received event network-vif-deleted-c8f841b5-bac4-4aed-83b6-65ebccb9c49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:23 compute-1 nova_compute[189066]: 2025-12-05 09:32:23.636 189070 INFO nova.compute.manager [req-c97b8df9-c730-4eaa-9efd-21f406422007 req-7f0b53c4-20e9-4b0e-8e4c-02850f0f276e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Neutron deleted interface c8f841b5-bac4-4aed-83b6-65ebccb9c49e; detaching it from the instance and deleting it from the info cache
Dec 05 09:32:23 compute-1 nova_compute[189066]: 2025-12-05 09:32:23.636 189070 DEBUG nova.network.neutron [req-c97b8df9-c730-4eaa-9efd-21f406422007 req-7f0b53c4-20e9-4b0e-8e4c-02850f0f276e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Dec 05 09:32:23 compute-1 nova_compute[189066]: 2025-12-05 09:32:23.639 189070 DEBUG nova.compute.manager [req-c97b8df9-c730-4eaa-9efd-21f406422007 req-7f0b53c4-20e9-4b0e-8e4c-02850f0f276e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Detach interface failed, port_id=c8f841b5-bac4-4aed-83b6-65ebccb9c49e, reason: Instance a0adc838-396e-45a2-8503-a2ab451cd778 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Dec 05 09:32:24 compute-1 nova_compute[189066]: 2025-12-05 09:32:24.264 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:24 compute-1 nova_compute[189066]: 2025-12-05 09:32:24.431 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:32:24 compute-1 nova_compute[189066]: 2025-12-05 09:32:24.498 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Triggering sync for uuid b843e130-e156-47b6-8a2a-d4811973b93a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 09:32:24 compute-1 nova_compute[189066]: 2025-12-05 09:32:24.499 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:24 compute-1 nova_compute[189066]: 2025-12-05 09:32:24.499 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "b843e130-e156-47b6-8a2a-d4811973b93a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:24 compute-1 nova_compute[189066]: 2025-12-05 09:32:24.540 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "b843e130-e156-47b6-8a2a-d4811973b93a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:24 compute-1 nova_compute[189066]: 2025-12-05 09:32:24.675 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:26 compute-1 nova_compute[189066]: 2025-12-05 09:32:26.522 189070 DEBUG nova.compute.manager [req-70bcf2f9-a121-4629-a4b5-7b8b86660907 req-ae86c0e1-fc31-4779-aab4-00446d51243f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-changed-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:26 compute-1 nova_compute[189066]: 2025-12-05 09:32:26.523 189070 DEBUG nova.compute.manager [req-70bcf2f9-a121-4629-a4b5-7b8b86660907 req-ae86c0e1-fc31-4779-aab4-00446d51243f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Refreshing instance network info cache due to event network-changed-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:32:26 compute-1 nova_compute[189066]: 2025-12-05 09:32:26.523 189070 DEBUG oslo_concurrency.lockutils [req-70bcf2f9-a121-4629-a4b5-7b8b86660907 req-ae86c0e1-fc31-4779-aab4-00446d51243f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:32:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:26.794 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:27 compute-1 podman[226975]: 2025-12-05 09:32:27.624059357 +0000 UTC m=+0.063374540 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:32:27 compute-1 nova_compute[189066]: 2025-12-05 09:32:27.635 189070 DEBUG nova.network.neutron [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.267 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.695 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.930 189070 DEBUG oslo_concurrency.lockutils [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.932 189070 DEBUG oslo_concurrency.lockutils [req-70bcf2f9-a121-4629-a4b5-7b8b86660907 req-ae86c0e1-fc31-4779-aab4-00446d51243f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.932 189070 DEBUG nova.network.neutron [req-70bcf2f9-a121-4629-a4b5-7b8b86660907 req-ae86c0e1-fc31-4779-aab4-00446d51243f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Refreshing network info cache for port 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.935 189070 DEBUG nova.virt.libvirt.vif [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:31:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:31:53Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.936 189070 DEBUG nova.network.os_vif_util [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.936 189070 DEBUG nova.network.os_vif_util [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.937 189070 DEBUG os_vif [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.937 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.938 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.938 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.941 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.942 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68d7a18a-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.943 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68d7a18a-f1, col_values=(('external_ids', {'iface-id': '68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:c1:b0', 'vm-uuid': 'b843e130-e156-47b6-8a2a-d4811973b93a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.944 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:29 compute-1 NetworkManager[55704]: <info>  [1764927149.9462] manager: (tap68d7a18a-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.946 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.951 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.952 189070 INFO os_vif [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1')
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.953 189070 DEBUG nova.virt.libvirt.vif [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:31:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:31:53Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.953 189070 DEBUG nova.network.os_vif_util [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.954 189070 DEBUG nova.network.os_vif_util [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.957 189070 DEBUG nova.virt.libvirt.guest [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] attach device xml: <interface type="ethernet">
Dec 05 09:32:29 compute-1 nova_compute[189066]:   <mac address="fa:16:3e:af:c1:b0"/>
Dec 05 09:32:29 compute-1 nova_compute[189066]:   <model type="virtio"/>
Dec 05 09:32:29 compute-1 nova_compute[189066]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:32:29 compute-1 nova_compute[189066]:   <mtu size="1442"/>
Dec 05 09:32:29 compute-1 nova_compute[189066]:   <target dev="tap68d7a18a-f1"/>
Dec 05 09:32:29 compute-1 nova_compute[189066]: </interface>
Dec 05 09:32:29 compute-1 nova_compute[189066]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 09:32:29 compute-1 kernel: tap68d7a18a-f1: entered promiscuous mode
Dec 05 09:32:29 compute-1 NetworkManager[55704]: <info>  [1764927149.9717] manager: (tap68d7a18a-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Dec 05 09:32:29 compute-1 ovn_controller[95809]: 2025-12-05T09:32:29Z|00158|binding|INFO|Claiming lport 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 for this chassis.
Dec 05 09:32:29 compute-1 ovn_controller[95809]: 2025-12-05T09:32:29Z|00159|binding|INFO|68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5: Claiming fa:16:3e:af:c1:b0 10.100.0.18
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.973 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:29 compute-1 nova_compute[189066]: 2025-12-05 09:32:29.975 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:29 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:29.988 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:c1:b0 10.100.0.18'], port_security=['fa:16:3e:af:c1:b0 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'b843e130-e156-47b6-8a2a-d4811973b93a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-644a5de2-2e9c-48ea-b27d-a8992c881d84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b999a1ea-2a47-43bb-b6d8-1df5b14a091a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ebe793e-a06d-45b8-9629-f9aa35f26666, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:32:29 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:29.989 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 in datapath 644a5de2-2e9c-48ea-b27d-a8992c881d84 bound to our chassis
Dec 05 09:32:29 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:29.991 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 644a5de2-2e9c-48ea-b27d-a8992c881d84
Dec 05 09:32:30 compute-1 systemd-udevd[227007]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.004 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[1f974bb3-aeba-4df6-81e5-3a35926fd047]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.006 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap644a5de2-21 in ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.008 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap644a5de2-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.009 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[991a6992-454c-471e-99af-b87f64b6c3bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.009 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.010 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0ba256-bc30-49be-826d-008744cc58e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_controller[95809]: 2025-12-05T09:32:30Z|00160|binding|INFO|Setting lport 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 ovn-installed in OVS
Dec 05 09:32:30 compute-1 ovn_controller[95809]: 2025-12-05T09:32:30Z|00161|binding|INFO|Setting lport 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 up in Southbound
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.011 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:30 compute-1 NetworkManager[55704]: <info>  [1764927150.0176] device (tap68d7a18a-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:32:30 compute-1 NetworkManager[55704]: <info>  [1764927150.0189] device (tap68d7a18a-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.026 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[52b3270f-2815-493f-913b-46f6592ea6f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.045 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ea97332d-dc92-423e-9850-7aa51cf1e5d4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.082 189070 DEBUG nova.virt.libvirt.driver [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.082 189070 DEBUG nova.virt.libvirt.driver [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.082 189070 DEBUG nova.virt.libvirt.driver [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:7f:68:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.083 189070 DEBUG nova.virt.libvirt.driver [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:af:c1:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.088 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7f8d38-7e46-4f98-bf7e-cb6cfdc69cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.093 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[03ef0133-a721-4e25-baa9-b242c75f9646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 NetworkManager[55704]: <info>  [1764927150.0946] manager: (tap644a5de2-20): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.115 189070 DEBUG nova.virt.libvirt.guest [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:32:30 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1503446723</nova:name>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:32:30</nova:creationTime>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:port uuid="d7d3d638-32a8-4d3e-abce-d0e8942a6c22">
Dec 05 09:32:30 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     <nova:port uuid="68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5">
Dec 05 09:32:30 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Dec 05 09:32:30 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:30 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:32:30 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:32:30 compute-1 nova_compute[189066]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.127 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[9d21aa83-740e-4391-afe6-fcbce87116d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.131 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[7f96f55b-1a96-488e-9742-e0e553c91875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.148 189070 DEBUG oslo_concurrency.lockutils [None req-f7d4fc56-c840-44e4-a51b-b1f6a2d762ce 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "interface-b843e130-e156-47b6-8a2a-d4811973b93a-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 10.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:30 compute-1 NetworkManager[55704]: <info>  [1764927150.1573] device (tap644a5de2-20): carrier: link connected
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.162 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[b44fb103-06e3-4029-aa09-9aedc766371c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.182 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[585ec0d2-89c9-4240-ad65-ffddb0a1a317]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap644a5de2-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:af:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442316, 'reachable_time': 29733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227033, 'error': None, 'target': 'ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.207 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[82bb868c-86b0-4c91-920d-0b2806fbb4dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:af3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442316, 'tstamp': 442316}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227034, 'error': None, 'target': 'ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.227 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0078219c-b1e2-4b36-87ea-492279d56bf2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap644a5de2-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:af:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442316, 'reachable_time': 29733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227035, 'error': None, 'target': 'ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.263 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[117d0023-aa4e-46db-bd22-f4c4f503e12b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.333 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2797fb3f-23a8-404e-9e21-323eb3074e67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.335 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap644a5de2-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.335 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.336 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap644a5de2-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.338 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:30 compute-1 kernel: tap644a5de2-20: entered promiscuous mode
Dec 05 09:32:30 compute-1 NetworkManager[55704]: <info>  [1764927150.3388] manager: (tap644a5de2-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.340 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.342 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap644a5de2-20, col_values=(('external_ids', {'iface-id': 'a9118915-361e-4e57-be0f-2dadc6e274fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:30 compute-1 ovn_controller[95809]: 2025-12-05T09:32:30Z|00162|binding|INFO|Releasing lport a9118915-361e-4e57-be0f-2dadc6e274fb from this chassis (sb_readonly=0)
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.344 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.346 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/644a5de2-2e9c-48ea-b27d-a8992c881d84.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/644a5de2-2e9c-48ea-b27d-a8992c881d84.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.347 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[de5b7dcf-92dd-4609-bae8-ca1bee0489d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.348 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-644a5de2-2e9c-48ea-b27d-a8992c881d84
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/644a5de2-2e9c-48ea-b27d-a8992c881d84.pid.haproxy
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 644a5de2-2e9c-48ea-b27d-a8992c881d84
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:32:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:30.348 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84', 'env', 'PROCESS_TAG=haproxy-644a5de2-2e9c-48ea-b27d-a8992c881d84', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/644a5de2-2e9c-48ea-b27d-a8992c881d84.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:32:30 compute-1 nova_compute[189066]: 2025-12-05 09:32:30.356 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:30 compute-1 podman[227067]: 2025-12-05 09:32:30.804438219 +0000 UTC m=+0.072490454 container create 3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 09:32:30 compute-1 systemd[1]: Started libpod-conmon-3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817.scope.
Dec 05 09:32:30 compute-1 podman[227067]: 2025-12-05 09:32:30.772510364 +0000 UTC m=+0.040562619 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:32:30 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:32:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42a599b9659148e1cd1e6021407d3d1d37f18d5111029bce9a9234222756d4c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:32:30 compute-1 podman[227067]: 2025-12-05 09:32:30.915315938 +0000 UTC m=+0.183368193 container init 3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:32:30 compute-1 podman[227067]: 2025-12-05 09:32:30.922771521 +0000 UTC m=+0.190823756 container start 3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 09:32:30 compute-1 neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84[227083]: [NOTICE]   (227087) : New worker (227089) forked
Dec 05 09:32:30 compute-1 neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84[227083]: [NOTICE]   (227087) : Loading success.
Dec 05 09:32:32 compute-1 ovn_controller[95809]: 2025-12-05T09:32:32Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:c1:b0 10.100.0.18
Dec 05 09:32:32 compute-1 ovn_controller[95809]: 2025-12-05T09:32:32Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:c1:b0 10.100.0.18
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.781 189070 DEBUG oslo_concurrency.lockutils [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "interface-b843e130-e156-47b6-8a2a-d4811973b93a-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.781 189070 DEBUG oslo_concurrency.lockutils [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "interface-b843e130-e156-47b6-8a2a-d4811973b93a-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.815 189070 DEBUG nova.objects.instance [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'flavor' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.867 189070 DEBUG nova.virt.libvirt.vif [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:31:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:31:53Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.868 189070 DEBUG nova.network.os_vif_util [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.868 189070 DEBUG nova.network.os_vif_util [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.873 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.875 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.877 189070 DEBUG nova.virt.libvirt.driver [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Attempting to detach device tap68d7a18a-f1 from instance b843e130-e156-47b6-8a2a-d4811973b93a from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.878 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] detach device xml: <interface type="ethernet">
Dec 05 09:32:32 compute-1 nova_compute[189066]:   <mac address="fa:16:3e:af:c1:b0"/>
Dec 05 09:32:32 compute-1 nova_compute[189066]:   <model type="virtio"/>
Dec 05 09:32:32 compute-1 nova_compute[189066]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:32:32 compute-1 nova_compute[189066]:   <mtu size="1442"/>
Dec 05 09:32:32 compute-1 nova_compute[189066]:   <target dev="tap68d7a18a-f1"/>
Dec 05 09:32:32 compute-1 nova_compute[189066]: </interface>
Dec 05 09:32:32 compute-1 nova_compute[189066]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.910 189070 DEBUG nova.compute.manager [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.910 189070 DEBUG oslo_concurrency.lockutils [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.910 189070 DEBUG oslo_concurrency.lockutils [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.911 189070 DEBUG oslo_concurrency.lockutils [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.911 189070 DEBUG nova.compute.manager [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] No waiting events found dispatching network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.911 189070 WARNING nova.compute.manager [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received unexpected event network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 for instance with vm_state active and task_state None.
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.911 189070 DEBUG nova.compute.manager [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.911 189070 DEBUG oslo_concurrency.lockutils [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.912 189070 DEBUG oslo_concurrency.lockutils [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.912 189070 DEBUG oslo_concurrency.lockutils [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.912 189070 DEBUG nova.compute.manager [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] No waiting events found dispatching network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:32:32 compute-1 nova_compute[189066]: 2025-12-05 09:32:32.912 189070 WARNING nova.compute.manager [req-0948f4d1-84eb-4f80-b3cb-b186db72d9a1 req-b11e82db-fcf1-497e-a3c3-6d52d8b97f98 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received unexpected event network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 for instance with vm_state active and task_state None.
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.012 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.018 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface>not found in domain: <domain type='kvm' id='12'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <name>instance-00000017</name>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <uuid>b843e130-e156-47b6-8a2a-d4811973b93a</uuid>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1503446723</nova:name>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:32:30</nova:creationTime>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:port uuid="d7d3d638-32a8-4d3e-abce-d0e8942a6c22">
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:port uuid="68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5">
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:32:33 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <memory unit='KiB'>131072</memory>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <vcpu placement='static'>1</vcpu>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <resource>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <partition>/machine</partition>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </resource>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <sysinfo type='smbios'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <system>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='manufacturer'>RDO</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='serial'>b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='uuid'>b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='family'>Virtual Machine</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </system>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <os>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <boot dev='hd'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <smbios mode='sysinfo'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </os>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <features>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <vmcoreinfo state='on'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </features>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <model fallback='forbid'>Nehalem</model>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <feature policy='require' name='x2apic'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <feature policy='require' name='hypervisor'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <feature policy='require' name='vme'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <clock offset='utc'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <timer name='hpet' present='no'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <on_poweroff>destroy</on_poweroff>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <on_reboot>restart</on_reboot>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <on_crash>destroy</on_crash>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <disk type='file' device='disk'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk' index='2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <backingStore type='file' index='3'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:         <format type='raw'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:         <source file='/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:         <backingStore/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       </backingStore>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target dev='vda' bus='virtio'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='virtio-disk0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <disk type='file' device='cdrom'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.config' index='1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <backingStore/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target dev='sda' bus='sata'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <readonly/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='sata0-0-0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pcie.0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='1' port='0x10'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='2' port='0x11'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='3' port='0x12'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.3'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='4' port='0x13'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.4'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='5' port='0x14'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.5'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='6' port='0x15'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.6'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='7' port='0x16'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.7'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='8' port='0x17'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.8'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='9' port='0x18'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.9'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='10' port='0x19'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.10'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='11' port='0x1a'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.11'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='12' port='0x1b'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.12'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='13' port='0x1c'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.13'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='14' port='0x1d'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.14'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='15' port='0x1e'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.15'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='16' port='0x1f'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.16'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='17' port='0x20'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.17'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='18' port='0x21'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.18'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='19' port='0x22'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.19'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='20' port='0x23'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.20'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='21' port='0x24'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.21'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='22' port='0x25'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.22'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='23' port='0x26'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.23'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='24' port='0x27'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.24'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='25' port='0x28'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.25'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-pci-bridge'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.26'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='usb'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='sata' index='0'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='ide'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:7f:68:29'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target dev='tapd7d3d638-32'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='net0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:af:c1:b0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target dev='tap68d7a18a-f1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='net1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <serial type='pty'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <source path='/dev/pts/1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log' append='off'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target type='isa-serial' port='0'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:         <model name='isa-serial'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       </target>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <console type='pty' tty='/dev/pts/1'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <source path='/dev/pts/1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log' append='off'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target type='serial' port='0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </console>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <input type='tablet' bus='usb'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='input0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='usb' bus='0' port='1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <input type='mouse' bus='ps2'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='input1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <input type='keyboard' bus='ps2'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='input2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <listen type='address' address='::0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <audio id='1' type='none'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <video>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='video0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </video>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <watchdog model='itco' action='reset'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='watchdog0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </watchdog>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <memballoon model='virtio'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <stats period='10'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='balloon0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <rng model='virtio'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <backend model='random'>/dev/urandom</backend>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='rng0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <label>system_u:system_r:svirt_t:s0:c183,c365</label>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c183,c365</imagelabel>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <label>+107:+107</label>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <imagelabel>+107:+107</imagelabel>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:32:33 compute-1 nova_compute[189066]: </domain>
Dec 05 09:32:33 compute-1 nova_compute[189066]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.020 189070 INFO nova.virt.libvirt.driver [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully detached device tap68d7a18a-f1 from instance b843e130-e156-47b6-8a2a-d4811973b93a from the persistent domain config.
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.021 189070 DEBUG nova.virt.libvirt.driver [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] (1/8): Attempting to detach device tap68d7a18a-f1 with device alias net1 from instance b843e130-e156-47b6-8a2a-d4811973b93a from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.022 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] detach device xml: <interface type="ethernet">
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <mac address="fa:16:3e:af:c1:b0"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <model type="virtio"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <mtu size="1442"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <target dev="tap68d7a18a-f1"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]: </interface>
Dec 05 09:32:33 compute-1 nova_compute[189066]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 09:32:33 compute-1 kernel: tap68d7a18a-f1 (unregistering): left promiscuous mode
Dec 05 09:32:33 compute-1 NetworkManager[55704]: <info>  [1764927153.1294] device (tap68d7a18a-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:32:33 compute-1 ovn_controller[95809]: 2025-12-05T09:32:33Z|00163|binding|INFO|Releasing lport 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 from this chassis (sb_readonly=0)
Dec 05 09:32:33 compute-1 ovn_controller[95809]: 2025-12-05T09:32:33Z|00164|binding|INFO|Setting lport 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 down in Southbound
Dec 05 09:32:33 compute-1 ovn_controller[95809]: 2025-12-05T09:32:33Z|00165|binding|INFO|Removing iface tap68d7a18a-f1 ovn-installed in OVS
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.163 189070 DEBUG nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Received event <DeviceRemovedEvent: 1764927153.1624312, b843e130-e156-47b6-8a2a-d4811973b93a => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.168 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.170 189070 DEBUG nova.virt.libvirt.driver [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Start waiting for the detach event from libvirt for device tap68d7a18a-f1 with device alias net1 for instance b843e130-e156-47b6-8a2a-d4811973b93a _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.171 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.176 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface>not found in domain: <domain type='kvm' id='12'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <name>instance-00000017</name>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <uuid>b843e130-e156-47b6-8a2a-d4811973b93a</uuid>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1503446723</nova:name>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:32:30</nova:creationTime>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:port uuid="d7d3d638-32a8-4d3e-abce-d0e8942a6c22">
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:port uuid="68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5">
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:32:33 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <memory unit='KiB'>131072</memory>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <vcpu placement='static'>1</vcpu>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <resource>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <partition>/machine</partition>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </resource>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <sysinfo type='smbios'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <system>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='manufacturer'>RDO</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='serial'>b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='uuid'>b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <entry name='family'>Virtual Machine</entry>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </system>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <os>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <boot dev='hd'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <smbios mode='sysinfo'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </os>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <features>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <vmcoreinfo state='on'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </features>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <model fallback='forbid'>Nehalem</model>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <feature policy='require' name='x2apic'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <feature policy='require' name='hypervisor'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <feature policy='require' name='vme'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <clock offset='utc'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <timer name='hpet' present='no'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <on_poweroff>destroy</on_poweroff>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <on_reboot>restart</on_reboot>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <on_crash>destroy</on_crash>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <disk type='file' device='disk'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk' index='2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <backingStore type='file' index='3'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:         <format type='raw'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:         <source file='/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:         <backingStore/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       </backingStore>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target dev='vda' bus='virtio'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='virtio-disk0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <disk type='file' device='cdrom'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.config' index='1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <backingStore/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target dev='sda' bus='sata'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <readonly/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='sata0-0-0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pcie.0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='1' port='0x10'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='2' port='0x11'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='3' port='0x12'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.3'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='4' port='0x13'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.4'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='5' port='0x14'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.5'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='6' port='0x15'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.6'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='7' port='0x16'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.7'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='8' port='0x17'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.8'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='9' port='0x18'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.9'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='10' port='0x19'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.10'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='11' port='0x1a'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.11'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='12' port='0x1b'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.12'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='13' port='0x1c'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.13'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='14' port='0x1d'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.14'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='15' port='0x1e'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.15'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='16' port='0x1f'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.16'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='17' port='0x20'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.17'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='18' port='0x21'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.18'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='19' port='0x22'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.19'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='20' port='0x23'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.20'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='21' port='0x24'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.21'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='22' port='0x25'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.22'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='23' port='0x26'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.23'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='24' port='0x27'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.24'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target chassis='25' port='0x28'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.25'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model name='pcie-pci-bridge'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='pci.26'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='usb'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <controller type='sata' index='0'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='ide'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:7f:68:29'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target dev='tapd7d3d638-32'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='net0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <serial type='pty'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <source path='/dev/pts/1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log' append='off'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target type='isa-serial' port='0'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:         <model name='isa-serial'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       </target>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <console type='pty' tty='/dev/pts/1'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <source path='/dev/pts/1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log' append='off'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <target type='serial' port='0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </console>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <input type='tablet' bus='usb'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='input0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='usb' bus='0' port='1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <input type='mouse' bus='ps2'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='input1'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <input type='keyboard' bus='ps2'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='input2'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <listen type='address' address='::0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <audio id='1' type='none'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <video>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='video0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </video>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <watchdog model='itco' action='reset'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='watchdog0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </watchdog>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <memballoon model='virtio'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <stats period='10'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='balloon0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <rng model='virtio'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <backend model='random'>/dev/urandom</backend>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <alias name='rng0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <label>system_u:system_r:svirt_t:s0:c183,c365</label>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c183,c365</imagelabel>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <label>+107:+107</label>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <imagelabel>+107:+107</imagelabel>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:32:33 compute-1 nova_compute[189066]: </domain>
Dec 05 09:32:33 compute-1 nova_compute[189066]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.177 189070 INFO nova.virt.libvirt.driver [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully detached device tap68d7a18a-f1 from instance b843e130-e156-47b6-8a2a-d4811973b93a from the live domain config.
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.178 189070 DEBUG nova.virt.libvirt.vif [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:31:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:31:53Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.178 189070 DEBUG nova.network.os_vif_util [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.179 189070 DEBUG nova.network.os_vif_util [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.179 189070 DEBUG os_vif [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.180 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.181 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68d7a18a-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.182 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.182 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.184 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.185 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.187 189070 INFO os_vif [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1')
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.188 189070 DEBUG nova.virt.libvirt.guest [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1503446723</nova:name>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:32:33</nova:creationTime>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     <nova:port uuid="d7d3d638-32a8-4d3e-abce-d0e8942a6c22">
Dec 05 09:32:33 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:32:33 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:33 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:32:33 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:32:33 compute-1 nova_compute[189066]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.199 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:c1:b0 10.100.0.18'], port_security=['fa:16:3e:af:c1:b0 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'b843e130-e156-47b6-8a2a-d4811973b93a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-644a5de2-2e9c-48ea-b27d-a8992c881d84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b999a1ea-2a47-43bb-b6d8-1df5b14a091a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ebe793e-a06d-45b8-9629-f9aa35f26666, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.201 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 in datapath 644a5de2-2e9c-48ea-b27d-a8992c881d84 unbound from our chassis
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.203 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 644a5de2-2e9c-48ea-b27d-a8992c881d84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.204 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[60c0fb58-7bb2-457b-94a7-e7fee0964d3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.205 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84 namespace which is not needed anymore
Dec 05 09:32:33 compute-1 neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84[227083]: [NOTICE]   (227087) : haproxy version is 2.8.14-c23fe91
Dec 05 09:32:33 compute-1 neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84[227083]: [NOTICE]   (227087) : path to executable is /usr/sbin/haproxy
Dec 05 09:32:33 compute-1 neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84[227083]: [WARNING]  (227087) : Exiting Master process...
Dec 05 09:32:33 compute-1 neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84[227083]: [WARNING]  (227087) : Exiting Master process...
Dec 05 09:32:33 compute-1 neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84[227083]: [ALERT]    (227087) : Current worker (227089) exited with code 143 (Terminated)
Dec 05 09:32:33 compute-1 neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84[227083]: [WARNING]  (227087) : All workers exited. Exiting... (0)
Dec 05 09:32:33 compute-1 systemd[1]: libpod-3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817.scope: Deactivated successfully.
Dec 05 09:32:33 compute-1 podman[227121]: 2025-12-05 09:32:33.363506905 +0000 UTC m=+0.052486322 container died 3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 09:32:33 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817-userdata-shm.mount: Deactivated successfully.
Dec 05 09:32:33 compute-1 systemd[1]: var-lib-containers-storage-overlay-c42a599b9659148e1cd1e6021407d3d1d37f18d5111029bce9a9234222756d4c-merged.mount: Deactivated successfully.
Dec 05 09:32:33 compute-1 podman[227121]: 2025-12-05 09:32:33.413230349 +0000 UTC m=+0.102209746 container cleanup 3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:32:33 compute-1 systemd[1]: libpod-conmon-3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817.scope: Deactivated successfully.
Dec 05 09:32:33 compute-1 podman[227152]: 2025-12-05 09:32:33.487111846 +0000 UTC m=+0.046400712 container remove 3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.494 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[87b97173-38ba-49b8-9364-2ba61333c116]: (4, ('Fri Dec  5 09:32:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84 (3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817)\n3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817\nFri Dec  5 09:32:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84 (3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817)\n3729f9bed34cfd51910d138bbc18f75eafb4805b9d82140fca8e0fc9b6dc9817\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.496 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[03bee2c2-27a8-47fc-ab4f-c8c99e659102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.498 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap644a5de2-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.501 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:33 compute-1 kernel: tap644a5de2-20: left promiscuous mode
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.504 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.509 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[53f6c908-5d60-4f93-b4bb-13ef5a52b637]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.518 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.530 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[01ed439d-1dce-41e1-8aeb-ab5fa06082f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.532 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[30a455fd-4ea0-4a90-81d4-c9145ed13b50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.537 189070 DEBUG nova.network.neutron [req-70bcf2f9-a121-4629-a4b5-7b8b86660907 req-ae86c0e1-fc31-4779-aab4-00446d51243f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updated VIF entry in instance network info cache for port 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.538 189070 DEBUG nova.network.neutron [req-70bcf2f9-a121-4629-a4b5-7b8b86660907 req-ae86c0e1-fc31-4779-aab4-00446d51243f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.550 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[90ca5580-6e50-4bee-93e5-69e99051d2d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442308, 'reachable_time': 33520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227165, 'error': None, 'target': 'ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.554 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-644a5de2-2e9c-48ea-b27d-a8992c881d84 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:32:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:33.554 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[fc93bc5e-19a9-4ce5-a9a6-305a73fd4f54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:33 compute-1 systemd[1]: run-netns-ovnmeta\x2d644a5de2\x2d2e9c\x2d48ea\x2db27d\x2da8992c881d84.mount: Deactivated successfully.
Dec 05 09:32:33 compute-1 nova_compute[189066]: 2025-12-05 09:32:33.566 189070 DEBUG oslo_concurrency.lockutils [req-70bcf2f9-a121-4629-a4b5-7b8b86660907 req-ae86c0e1-fc31-4779-aab4-00446d51243f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:32:34 compute-1 nova_compute[189066]: 2025-12-05 09:32:34.206 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927139.2051697, a0adc838-396e-45a2-8503-a2ab451cd778 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:32:34 compute-1 nova_compute[189066]: 2025-12-05 09:32:34.207 189070 INFO nova.compute.manager [-] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] VM Stopped (Lifecycle Event)
Dec 05 09:32:34 compute-1 nova_compute[189066]: 2025-12-05 09:32:34.361 189070 DEBUG nova.compute.manager [None req-1621a97f-0057-46ef-8aec-9f00329a85cb - - - - - -] [instance: a0adc838-396e-45a2-8503-a2ab451cd778] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:32:34 compute-1 nova_compute[189066]: 2025-12-05 09:32:34.452 189070 DEBUG oslo_concurrency.lockutils [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:32:34 compute-1 nova_compute[189066]: 2025-12-05 09:32:34.452 189070 DEBUG oslo_concurrency.lockutils [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:32:34 compute-1 nova_compute[189066]: 2025-12-05 09:32:34.453 189070 DEBUG nova.network.neutron [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:32:34 compute-1 podman[227166]: 2025-12-05 09:32:34.630298804 +0000 UTC m=+0.067321747 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:32:34 compute-1 nova_compute[189066]: 2025-12-05 09:32:34.699 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:35 compute-1 ovn_controller[95809]: 2025-12-05T09:32:35Z|00166|binding|INFO|Releasing lport 99e3597b-6e93-4623-ba40-68c3147cea55 from this chassis (sb_readonly=0)
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.394 189070 DEBUG nova.compute.manager [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-unplugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.395 189070 DEBUG oslo_concurrency.lockutils [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.395 189070 DEBUG oslo_concurrency.lockutils [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.395 189070 DEBUG oslo_concurrency.lockutils [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.395 189070 DEBUG nova.compute.manager [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] No waiting events found dispatching network-vif-unplugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.396 189070 WARNING nova.compute.manager [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received unexpected event network-vif-unplugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 for instance with vm_state active and task_state None.
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.396 189070 DEBUG nova.compute.manager [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.396 189070 DEBUG oslo_concurrency.lockutils [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.396 189070 DEBUG oslo_concurrency.lockutils [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.396 189070 DEBUG oslo_concurrency.lockutils [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.397 189070 DEBUG nova.compute.manager [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] No waiting events found dispatching network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.397 189070 WARNING nova.compute.manager [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received unexpected event network-vif-plugged-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 for instance with vm_state active and task_state None.
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.397 189070 DEBUG nova.compute.manager [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-deleted-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.397 189070 INFO nova.compute.manager [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Neutron deleted interface 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5; detaching it from the instance and deleting it from the info cache
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.397 189070 DEBUG nova.network.neutron [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.446 189070 DEBUG nova.objects.instance [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lazy-loading 'system_metadata' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.468 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.504 189070 DEBUG nova.objects.instance [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lazy-loading 'flavor' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.532 189070 DEBUG nova.virt.libvirt.vif [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:31:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:31:53Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.532 189070 DEBUG nova.network.os_vif_util [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Converting VIF {"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.533 189070 DEBUG nova.network.os_vif_util [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.537 189070 DEBUG nova.virt.libvirt.guest [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.540 189070 DEBUG nova.virt.libvirt.guest [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface>not found in domain: <domain type='kvm' id='12'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <name>instance-00000017</name>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <uuid>b843e130-e156-47b6-8a2a-d4811973b93a</uuid>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1503446723</nova:name>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:32:33</nova:creationTime>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:port uuid="d7d3d638-32a8-4d3e-abce-d0e8942a6c22">
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:32:35 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <memory unit='KiB'>131072</memory>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <vcpu placement='static'>1</vcpu>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <resource>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <partition>/machine</partition>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </resource>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <sysinfo type='smbios'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <system>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='manufacturer'>RDO</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='serial'>b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='uuid'>b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='family'>Virtual Machine</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </system>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <os>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <boot dev='hd'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <smbios mode='sysinfo'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </os>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <features>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <vmcoreinfo state='on'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </features>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <model fallback='forbid'>Nehalem</model>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <feature policy='require' name='x2apic'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <feature policy='require' name='hypervisor'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <feature policy='require' name='vme'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <clock offset='utc'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <timer name='hpet' present='no'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <on_poweroff>destroy</on_poweroff>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <on_reboot>restart</on_reboot>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <on_crash>destroy</on_crash>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <disk type='file' device='disk'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk' index='2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <backingStore type='file' index='3'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:         <format type='raw'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:         <source file='/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:         <backingStore/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       </backingStore>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target dev='vda' bus='virtio'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='virtio-disk0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <disk type='file' device='cdrom'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.config' index='1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <backingStore/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target dev='sda' bus='sata'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <readonly/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='sata0-0-0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pcie.0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='1' port='0x10'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='2' port='0x11'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='3' port='0x12'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.3'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='4' port='0x13'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.4'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='5' port='0x14'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.5'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='6' port='0x15'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.6'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='7' port='0x16'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.7'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='8' port='0x17'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.8'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='9' port='0x18'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.9'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='10' port='0x19'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.10'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='11' port='0x1a'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.11'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='12' port='0x1b'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.12'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='13' port='0x1c'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.13'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='14' port='0x1d'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.14'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='15' port='0x1e'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.15'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='16' port='0x1f'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.16'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='17' port='0x20'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.17'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='18' port='0x21'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.18'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='19' port='0x22'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.19'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='20' port='0x23'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.20'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='21' port='0x24'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.21'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='22' port='0x25'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.22'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='23' port='0x26'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.23'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='24' port='0x27'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.24'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='25' port='0x28'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.25'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-pci-bridge'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.26'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='usb'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='sata' index='0'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='ide'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:7f:68:29'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target dev='tapd7d3d638-32'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='net0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <serial type='pty'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <source path='/dev/pts/1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log' append='off'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target type='isa-serial' port='0'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:         <model name='isa-serial'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       </target>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <console type='pty' tty='/dev/pts/1'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <source path='/dev/pts/1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log' append='off'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target type='serial' port='0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </console>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <input type='tablet' bus='usb'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='input0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='usb' bus='0' port='1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <input type='mouse' bus='ps2'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='input1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <input type='keyboard' bus='ps2'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='input2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <listen type='address' address='::0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <audio id='1' type='none'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <video>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='video0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </video>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <watchdog model='itco' action='reset'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='watchdog0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </watchdog>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <memballoon model='virtio'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <stats period='10'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='balloon0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <rng model='virtio'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <backend model='random'>/dev/urandom</backend>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='rng0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <label>system_u:system_r:svirt_t:s0:c183,c365</label>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c183,c365</imagelabel>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <label>+107:+107</label>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <imagelabel>+107:+107</imagelabel>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:32:35 compute-1 nova_compute[189066]: </domain>
Dec 05 09:32:35 compute-1 nova_compute[189066]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.540 189070 DEBUG nova.virt.libvirt.guest [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.544 189070 DEBUG nova.virt.libvirt.guest [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:af:c1:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap68d7a18a-f1"/></interface>not found in domain: <domain type='kvm' id='12'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <name>instance-00000017</name>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <uuid>b843e130-e156-47b6-8a2a-d4811973b93a</uuid>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1503446723</nova:name>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:32:33</nova:creationTime>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:port uuid="d7d3d638-32a8-4d3e-abce-d0e8942a6c22">
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:32:35 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <memory unit='KiB'>131072</memory>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <vcpu placement='static'>1</vcpu>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <resource>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <partition>/machine</partition>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </resource>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <sysinfo type='smbios'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <system>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='manufacturer'>RDO</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='serial'>b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='uuid'>b843e130-e156-47b6-8a2a-d4811973b93a</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <entry name='family'>Virtual Machine</entry>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </system>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <os>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <boot dev='hd'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <smbios mode='sysinfo'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </os>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <features>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <vmcoreinfo state='on'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </features>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <model fallback='forbid'>Nehalem</model>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <feature policy='require' name='x2apic'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <feature policy='require' name='hypervisor'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <feature policy='require' name='vme'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <clock offset='utc'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <timer name='hpet' present='no'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <on_poweroff>destroy</on_poweroff>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <on_reboot>restart</on_reboot>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <on_crash>destroy</on_crash>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <disk type='file' device='disk'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk' index='2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <backingStore type='file' index='3'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:         <format type='raw'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:         <source file='/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:         <backingStore/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       </backingStore>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target dev='vda' bus='virtio'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='virtio-disk0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <disk type='file' device='cdrom'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/disk.config' index='1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <backingStore/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target dev='sda' bus='sata'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <readonly/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='sata0-0-0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pcie.0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='1' port='0x10'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='2' port='0x11'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='3' port='0x12'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.3'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='4' port='0x13'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.4'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='5' port='0x14'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.5'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='6' port='0x15'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.6'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='7' port='0x16'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.7'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='8' port='0x17'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.8'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='9' port='0x18'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.9'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='10' port='0x19'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.10'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='11' port='0x1a'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.11'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='12' port='0x1b'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.12'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='13' port='0x1c'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.13'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='14' port='0x1d'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.14'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='15' port='0x1e'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.15'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='16' port='0x1f'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.16'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='17' port='0x20'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.17'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='18' port='0x21'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.18'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='19' port='0x22'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.19'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='20' port='0x23'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.20'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='21' port='0x24'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.21'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='22' port='0x25'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.22'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='23' port='0x26'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.23'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='24' port='0x27'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.24'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target chassis='25' port='0x28'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.25'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model name='pcie-pci-bridge'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='pci.26'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='usb'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <controller type='sata' index='0'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='ide'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:7f:68:29'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target dev='tapd7d3d638-32'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='net0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <serial type='pty'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <source path='/dev/pts/1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log' append='off'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target type='isa-serial' port='0'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:         <model name='isa-serial'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       </target>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <console type='pty' tty='/dev/pts/1'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <source path='/dev/pts/1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a/console.log' append='off'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <target type='serial' port='0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </console>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <input type='tablet' bus='usb'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='input0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='usb' bus='0' port='1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <input type='mouse' bus='ps2'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='input1'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <input type='keyboard' bus='ps2'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='input2'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </input>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <listen type='address' address='::0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <audio id='1' type='none'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <video>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='video0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </video>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <watchdog model='itco' action='reset'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='watchdog0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </watchdog>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <memballoon model='virtio'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <stats period='10'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='balloon0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <rng model='virtio'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <backend model='random'>/dev/urandom</backend>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <alias name='rng0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <label>system_u:system_r:svirt_t:s0:c183,c365</label>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c183,c365</imagelabel>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <label>+107:+107</label>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <imagelabel>+107:+107</imagelabel>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:32:35 compute-1 nova_compute[189066]: </domain>
Dec 05 09:32:35 compute-1 nova_compute[189066]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.544 189070 WARNING nova.virt.libvirt.driver [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Detaching interface fa:16:3e:af:c1:b0 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap68d7a18a-f1' not found.
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.545 189070 DEBUG nova.virt.libvirt.vif [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:31:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:31:53Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.545 189070 DEBUG nova.network.os_vif_util [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Converting VIF {"id": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "address": "fa:16:3e:af:c1:b0", "network": {"id": "644a5de2-2e9c-48ea-b27d-a8992c881d84", "bridge": "br-int", "label": "tempest-network-smoke--1022632946", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d7a18a-f1", "ovs_interfaceid": "68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.546 189070 DEBUG nova.network.os_vif_util [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.546 189070 DEBUG os_vif [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.548 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.548 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68d7a18a-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.548 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.550 189070 INFO os_vif [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:c1:b0,bridge_name='br-int',has_traffic_filtering=True,id=68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5,network=Network(644a5de2-2e9c-48ea-b27d-a8992c881d84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d7a18a-f1')
Dec 05 09:32:35 compute-1 nova_compute[189066]: 2025-12-05 09:32:35.551 189070 DEBUG nova.virt.libvirt.guest [req-363f7fe4-b572-41ff-862b-035379403b87 req-0015fbf8-1f38-4066-8a6b-d7aaa56ae3e9 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1503446723</nova:name>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:32:35</nova:creationTime>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     <nova:port uuid="d7d3d638-32a8-4d3e-abce-d0e8942a6c22">
Dec 05 09:32:35 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:32:35 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:32:35 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:32:35 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:32:35 compute-1 nova_compute[189066]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.184 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.196 189070 INFO nova.network.neutron [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Port 68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.197 189070 DEBUG nova.network.neutron [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [{"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.333 189070 DEBUG oslo_concurrency.lockutils [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.356 189070 DEBUG oslo_concurrency.lockutils [None req-df92c33a-9cf2-435b-9d00-f524add3a333 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "interface-b843e130-e156-47b6-8a2a-d4811973b93a-68d7a18a-f1fa-4b3c-bad0-34623cfb7eb5" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.962 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.962 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.963 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.963 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.963 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.965 189070 INFO nova.compute.manager [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Terminating instance
Dec 05 09:32:38 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.966 189070 DEBUG nova.compute.manager [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:32:38 compute-1 kernel: tapd7d3d638-32 (unregistering): left promiscuous mode
Dec 05 09:32:38 compute-1 NetworkManager[55704]: <info>  [1764927158.9922] device (tapd7d3d638-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:32:39 compute-1 ovn_controller[95809]: 2025-12-05T09:32:39Z|00167|binding|INFO|Releasing lport d7d3d638-32a8-4d3e-abce-d0e8942a6c22 from this chassis (sb_readonly=0)
Dec 05 09:32:39 compute-1 ovn_controller[95809]: 2025-12-05T09:32:39Z|00168|binding|INFO|Setting lport d7d3d638-32a8-4d3e-abce-d0e8942a6c22 down in Southbound
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:38.999 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 ovn_controller[95809]: 2025-12-05T09:32:39Z|00169|binding|INFO|Removing iface tapd7d3d638-32 ovn-installed in OVS
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.002 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.009 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:68:29 10.100.0.10'], port_security=['fa:16:3e:7f:68:29 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b843e130-e156-47b6-8a2a-d4811973b93a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e493efd-28a8-4124-9b83-acc24853936f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2a619c6b-41fa-43e5-b2d4-5d1e0fb94c32', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1e94af5e-95ab-48c4-b791-d5e364f548eb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=d7d3d638-32a8-4d3e-abce-d0e8942a6c22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.010 105272 INFO neutron.agent.ovn.metadata.agent [-] Port d7d3d638-32a8-4d3e-abce-d0e8942a6c22 in datapath 7e493efd-28a8-4124-9b83-acc24853936f unbound from our chassis
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.013 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e493efd-28a8-4124-9b83-acc24853936f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.014 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[93b8a53b-9a15-421e-ab1c-fb97c66238b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.015 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f namespace which is not needed anymore
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.023 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Deactivated successfully.
Dec 05 09:32:39 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Consumed 16.244s CPU time.
Dec 05 09:32:39 compute-1 systemd-machined[154815]: Machine qemu-12-instance-00000017 terminated.
Dec 05 09:32:39 compute-1 neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f[226663]: [NOTICE]   (226674) : haproxy version is 2.8.14-c23fe91
Dec 05 09:32:39 compute-1 neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f[226663]: [NOTICE]   (226674) : path to executable is /usr/sbin/haproxy
Dec 05 09:32:39 compute-1 neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f[226663]: [WARNING]  (226674) : Exiting Master process...
Dec 05 09:32:39 compute-1 neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f[226663]: [WARNING]  (226674) : Exiting Master process...
Dec 05 09:32:39 compute-1 neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f[226663]: [ALERT]    (226674) : Current worker (226682) exited with code 143 (Terminated)
Dec 05 09:32:39 compute-1 neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f[226663]: [WARNING]  (226674) : All workers exited. Exiting... (0)
Dec 05 09:32:39 compute-1 systemd[1]: libpod-e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32.scope: Deactivated successfully.
Dec 05 09:32:39 compute-1 podman[227216]: 2025-12-05 09:32:39.175956398 +0000 UTC m=+0.050032561 container died e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.188 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.195 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32-userdata-shm.mount: Deactivated successfully.
Dec 05 09:32:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-761f53ea117b0cb1315f5f911b085bd8d2ffeb6bb05961299e6af01dda09830a-merged.mount: Deactivated successfully.
Dec 05 09:32:39 compute-1 podman[227216]: 2025-12-05 09:32:39.224480813 +0000 UTC m=+0.098556966 container cleanup e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.229 189070 INFO nova.virt.libvirt.driver [-] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Instance destroyed successfully.
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.230 189070 DEBUG nova.objects.instance [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'resources' on Instance uuid b843e130-e156-47b6-8a2a-d4811973b93a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:32:39 compute-1 systemd[1]: libpod-conmon-e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32.scope: Deactivated successfully.
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.262 189070 DEBUG nova.virt.libvirt.vif [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:31:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1503446723',display_name='tempest-TestNetworkBasicOps-server-1503446723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1503446723',id=23,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAxL8qwYiwFQ90XKpcC6fYNHvnM/WUgqEtm6wBf8f4M0HY8C3R6RbtlwIDxS0SDgqUTn2GT88UA7fyLkCPv6NpqTCLHrrPHnPjNeXNwF6PVPzq8XOcGedORn9QBGLd5VXw==',key_name='tempest-TestNetworkBasicOps-1643628689',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:31:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-0tfmxwfc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:31:53Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=b843e130-e156-47b6-8a2a-d4811973b93a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.263 189070 DEBUG nova.network.os_vif_util [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "address": "fa:16:3e:7f:68:29", "network": {"id": "7e493efd-28a8-4124-9b83-acc24853936f", "bridge": "br-int", "label": "tempest-network-smoke--1424299295", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d3d638-32", "ovs_interfaceid": "d7d3d638-32a8-4d3e-abce-d0e8942a6c22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.263 189070 DEBUG nova.network.os_vif_util [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:68:29,bridge_name='br-int',has_traffic_filtering=True,id=d7d3d638-32a8-4d3e-abce-d0e8942a6c22,network=Network(7e493efd-28a8-4124-9b83-acc24853936f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d3d638-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.264 189070 DEBUG os_vif [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:68:29,bridge_name='br-int',has_traffic_filtering=True,id=d7d3d638-32a8-4d3e-abce-d0e8942a6c22,network=Network(7e493efd-28a8-4124-9b83-acc24853936f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d3d638-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.265 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.266 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d3d638-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.267 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.269 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.273 189070 INFO os_vif [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:68:29,bridge_name='br-int',has_traffic_filtering=True,id=d7d3d638-32a8-4d3e-abce-d0e8942a6c22,network=Network(7e493efd-28a8-4124-9b83-acc24853936f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d3d638-32')
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.275 189070 INFO nova.virt.libvirt.driver [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Deleting instance files /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a_del
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.275 189070 INFO nova.virt.libvirt.driver [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Deletion of /var/lib/nova/instances/b843e130-e156-47b6-8a2a-d4811973b93a_del complete
Dec 05 09:32:39 compute-1 podman[227261]: 2025-12-05 09:32:39.307385602 +0000 UTC m=+0.052225485 container remove e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.314 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b212eaab-03dd-4f67-9224-20832d000787]: (4, ('Fri Dec  5 09:32:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f (e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32)\ne4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32\nFri Dec  5 09:32:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f (e4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32)\ne4cc6f7e95e20852e8ccab93d85b207a1077a1bf7f58c822bd1b72ac71502d32\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.316 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[26e16bc5-c248-437a-9008-5288815a8a55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.317 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e493efd-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.320 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 kernel: tap7e493efd-20: left promiscuous mode
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.322 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.327 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[85102e53-f8e4-4d32-b158-00151da6565b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.334 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.343 189070 INFO nova.compute.manager [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.343 189070 DEBUG oslo.service.loopingcall [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.344 189070 DEBUG nova.compute.manager [-] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.344 189070 DEBUG nova.network.neutron [-] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.351 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e062678c-b338-48cf-b650-bc2a0a3137c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.353 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[87b13f52-b06f-4d4c-84ba-dae8e34a18b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.372 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6cacb24a-9c2d-4a4d-bdd2-5cf6a704f1f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438480, 'reachable_time': 32679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227276, 'error': None, 'target': 'ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.375 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7e493efd-28a8-4124-9b83-acc24853936f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:32:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:32:39.375 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[e45fc968-afc3-4200-98ad-03599ce4b005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:32:39 compute-1 systemd[1]: run-netns-ovnmeta\x2d7e493efd\x2d28a8\x2d4124\x2d9b83\x2dacc24853936f.mount: Deactivated successfully.
Dec 05 09:32:39 compute-1 nova_compute[189066]: 2025-12-05 09:32:39.699 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.390 189070 DEBUG nova.compute.manager [req-463d0f59-3020-4092-a8ce-9ab399c21ee5 req-3f96a4af-5a76-445b-8885-e5d3c61dc43f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-changed-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.391 189070 DEBUG nova.compute.manager [req-463d0f59-3020-4092-a8ce-9ab399c21ee5 req-3f96a4af-5a76-445b-8885-e5d3c61dc43f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Refreshing instance network info cache due to event network-changed-d7d3d638-32a8-4d3e-abce-d0e8942a6c22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.391 189070 DEBUG oslo_concurrency.lockutils [req-463d0f59-3020-4092-a8ce-9ab399c21ee5 req-3f96a4af-5a76-445b-8885-e5d3c61dc43f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.391 189070 DEBUG oslo_concurrency.lockutils [req-463d0f59-3020-4092-a8ce-9ab399c21ee5 req-3f96a4af-5a76-445b-8885-e5d3c61dc43f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.391 189070 DEBUG nova.network.neutron [req-463d0f59-3020-4092-a8ce-9ab399c21ee5 req-3f96a4af-5a76-445b-8885-e5d3c61dc43f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Refreshing network info cache for port d7d3d638-32a8-4d3e-abce-d0e8942a6c22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.557 189070 DEBUG nova.network.neutron [-] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.592 189070 INFO nova.compute.manager [-] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Took 2.25 seconds to deallocate network for instance.
Dec 05 09:32:41 compute-1 podman[227277]: 2025-12-05 09:32:41.649340826 +0000 UTC m=+0.071674304 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.673 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.673 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.753 189070 DEBUG nova.compute.provider_tree [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.775 189070 DEBUG nova.scheduler.client.report [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.809 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.848 189070 INFO nova.scheduler.client.report [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Deleted allocations for instance b843e130-e156-47b6-8a2a-d4811973b93a
Dec 05 09:32:41 compute-1 nova_compute[189066]: 2025-12-05 09:32:41.929 189070 DEBUG oslo_concurrency.lockutils [None req-55b11bf3-0c42-4c35-99e7-472650d03dbd 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.967s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:42 compute-1 nova_compute[189066]: 2025-12-05 09:32:42.140 189070 INFO nova.network.neutron [req-463d0f59-3020-4092-a8ce-9ab399c21ee5 req-3f96a4af-5a76-445b-8885-e5d3c61dc43f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Port d7d3d638-32a8-4d3e-abce-d0e8942a6c22 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 09:32:42 compute-1 nova_compute[189066]: 2025-12-05 09:32:42.140 189070 DEBUG nova.network.neutron [req-463d0f59-3020-4092-a8ce-9ab399c21ee5 req-3f96a4af-5a76-445b-8885-e5d3c61dc43f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:32:42 compute-1 nova_compute[189066]: 2025-12-05 09:32:42.171 189070 DEBUG oslo_concurrency.lockutils [req-463d0f59-3020-4092-a8ce-9ab399c21ee5 req-3f96a4af-5a76-445b-8885-e5d3c61dc43f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b843e130-e156-47b6-8a2a-d4811973b93a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:32:42 compute-1 nova_compute[189066]: 2025-12-05 09:32:42.934 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:43 compute-1 nova_compute[189066]: 2025-12-05 09:32:43.579 189070 DEBUG nova.compute.manager [req-dc2b4bfe-d62a-4c01-9cd0-d364672f2f62 req-bd09f6eb-4b57-4864-a053-e3e50b1a134c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:43 compute-1 nova_compute[189066]: 2025-12-05 09:32:43.580 189070 DEBUG oslo_concurrency.lockutils [req-dc2b4bfe-d62a-4c01-9cd0-d364672f2f62 req-bd09f6eb-4b57-4864-a053-e3e50b1a134c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:32:43 compute-1 nova_compute[189066]: 2025-12-05 09:32:43.580 189070 DEBUG oslo_concurrency.lockutils [req-dc2b4bfe-d62a-4c01-9cd0-d364672f2f62 req-bd09f6eb-4b57-4864-a053-e3e50b1a134c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:32:43 compute-1 nova_compute[189066]: 2025-12-05 09:32:43.581 189070 DEBUG oslo_concurrency.lockutils [req-dc2b4bfe-d62a-4c01-9cd0-d364672f2f62 req-bd09f6eb-4b57-4864-a053-e3e50b1a134c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b843e130-e156-47b6-8a2a-d4811973b93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:32:43 compute-1 nova_compute[189066]: 2025-12-05 09:32:43.581 189070 DEBUG nova.compute.manager [req-dc2b4bfe-d62a-4c01-9cd0-d364672f2f62 req-bd09f6eb-4b57-4864-a053-e3e50b1a134c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] No waiting events found dispatching network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:32:43 compute-1 nova_compute[189066]: 2025-12-05 09:32:43.581 189070 WARNING nova.compute.manager [req-dc2b4bfe-d62a-4c01-9cd0-d364672f2f62 req-bd09f6eb-4b57-4864-a053-e3e50b1a134c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received unexpected event network-vif-plugged-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 for instance with vm_state deleted and task_state None.
Dec 05 09:32:43 compute-1 nova_compute[189066]: 2025-12-05 09:32:43.581 189070 DEBUG nova.compute.manager [req-dc2b4bfe-d62a-4c01-9cd0-d364672f2f62 req-bd09f6eb-4b57-4864-a053-e3e50b1a134c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Received event network-vif-deleted-d7d3d638-32a8-4d3e-abce-d0e8942a6c22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:32:44 compute-1 nova_compute[189066]: 2025-12-05 09:32:44.272 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:44 compute-1 nova_compute[189066]: 2025-12-05 09:32:44.702 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:45 compute-1 podman[227300]: 2025-12-05 09:32:45.662790376 +0000 UTC m=+0.097786806 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:32:48 compute-1 podman[227326]: 2025-12-05 09:32:48.620418458 +0000 UTC m=+0.058045510 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 09:32:49 compute-1 nova_compute[189066]: 2025-12-05 09:32:49.277 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:49 compute-1 nova_compute[189066]: 2025-12-05 09:32:49.704 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:50 compute-1 sshd-session[227298]: Connection closed by 101.47.162.91 port 43890 [preauth]
Dec 05 09:32:51 compute-1 podman[227343]: 2025-12-05 09:32:51.630371387 +0000 UTC m=+0.066429086 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 09:32:53 compute-1 nova_compute[189066]: 2025-12-05 09:32:53.453 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:53 compute-1 nova_compute[189066]: 2025-12-05 09:32:53.675 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:54 compute-1 nova_compute[189066]: 2025-12-05 09:32:54.228 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927159.2267177, b843e130-e156-47b6-8a2a-d4811973b93a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:32:54 compute-1 nova_compute[189066]: 2025-12-05 09:32:54.229 189070 INFO nova.compute.manager [-] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] VM Stopped (Lifecycle Event)
Dec 05 09:32:54 compute-1 nova_compute[189066]: 2025-12-05 09:32:54.263 189070 DEBUG nova.compute.manager [None req-76510e90-64e6-4199-ba7f-ca509b387976 - - - - - -] [instance: b843e130-e156-47b6-8a2a-d4811973b93a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:32:54 compute-1 nova_compute[189066]: 2025-12-05 09:32:54.279 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:54 compute-1 podman[227365]: 2025-12-05 09:32:54.616487959 +0000 UTC m=+0.057053414 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Dec 05 09:32:54 compute-1 nova_compute[189066]: 2025-12-05 09:32:54.706 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:58 compute-1 podman[227387]: 2025-12-05 09:32:58.616632011 +0000 UTC m=+0.053709472 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:32:59 compute-1 nova_compute[189066]: 2025-12-05 09:32:59.282 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:32:59 compute-1 nova_compute[189066]: 2025-12-05 09:32:59.708 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:04 compute-1 nova_compute[189066]: 2025-12-05 09:33:04.326 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:04 compute-1 nova_compute[189066]: 2025-12-05 09:33:04.710 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:05 compute-1 podman[227411]: 2025-12-05 09:33:05.621885145 +0000 UTC m=+0.060798418 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:33:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:08.878 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:08.878 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:08.878 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:09 compute-1 nova_compute[189066]: 2025-12-05 09:33:09.362 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:09 compute-1 nova_compute[189066]: 2025-12-05 09:33:09.712 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:33:10.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:33:12 compute-1 nova_compute[189066]: 2025-12-05 09:33:12.089 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:12 compute-1 podman[227434]: 2025-12-05 09:33:12.625768843 +0000 UTC m=+0.066445856 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute)
Dec 05 09:33:14 compute-1 nova_compute[189066]: 2025-12-05 09:33:14.366 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:14 compute-1 nova_compute[189066]: 2025-12-05 09:33:14.714 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.059 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.060 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.060 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.060 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.234 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.235 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5753MB free_disk=73.33370590209961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.235 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.235 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.312 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.313 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.341 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.363 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.402 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:33:15 compute-1 nova_compute[189066]: 2025-12-05 09:33:15.403 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:16 compute-1 podman[227454]: 2025-12-05 09:33:16.674934692 +0000 UTC m=+0.116441526 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 09:33:17 compute-1 nova_compute[189066]: 2025-12-05 09:33:17.403 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.191 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.191 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.291 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:33:18 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:18.292 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.294 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:18 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:18.294 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.752 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.752 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.765 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.766 189070 INFO nova.compute.claims [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.894 189070 DEBUG nova.compute.provider_tree [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.922 189070 DEBUG nova.scheduler.client.report [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.955 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:18 compute-1 nova_compute[189066]: 2025-12-05 09:33:18.956 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.011 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.012 189070 DEBUG nova.network.neutron [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.034 189070 INFO nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.059 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.200 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.201 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.202 189070 INFO nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Creating image(s)
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.203 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "/var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.203 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.204 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.219 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.287 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.289 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.290 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.301 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.361 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.363 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.385 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.404 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.405 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.406 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.431 189070 DEBUG nova.policy [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.473 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.474 189070 DEBUG nova.virt.disk.api [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Checking if we can resize image /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.474 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.537 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.538 189070 DEBUG nova.virt.disk.api [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Cannot resize image /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.539 189070 DEBUG nova.objects.instance [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.557 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.558 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Ensure instance console log exists: /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.559 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.559 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.559 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:19 compute-1 podman[227496]: 2025-12-05 09:33:19.62223404 +0000 UTC m=+0.053597351 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 05 09:33:19 compute-1 nova_compute[189066]: 2025-12-05 09:33:19.715 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:20 compute-1 nova_compute[189066]: 2025-12-05 09:33:20.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:21 compute-1 nova_compute[189066]: 2025-12-05 09:33:21.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:33:21 compute-1 nova_compute[189066]: 2025-12-05 09:33:21.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:33:21 compute-1 nova_compute[189066]: 2025-12-05 09:33:21.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:33:21 compute-1 nova_compute[189066]: 2025-12-05 09:33:21.040 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 09:33:21 compute-1 nova_compute[189066]: 2025-12-05 09:33:21.040 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:33:21 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:21.298 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:33:22 compute-1 podman[227514]: 2025-12-05 09:33:22.627257848 +0000 UTC m=+0.066022785 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec 05 09:33:23 compute-1 nova_compute[189066]: 2025-12-05 09:33:23.490 189070 DEBUG nova.network.neutron [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Successfully created port: f02068c2-6668-4bd6-9364-5ddc884043c6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:33:24 compute-1 nova_compute[189066]: 2025-12-05 09:33:24.405 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:24 compute-1 nova_compute[189066]: 2025-12-05 09:33:24.716 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:25 compute-1 podman[227535]: 2025-12-05 09:33:25.649564422 +0000 UTC m=+0.085758232 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, 
io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 09:33:29 compute-1 nova_compute[189066]: 2025-12-05 09:33:29.407 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:29 compute-1 podman[227556]: 2025-12-05 09:33:29.610549872 +0000 UTC m=+0.053522999 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:33:29 compute-1 nova_compute[189066]: 2025-12-05 09:33:29.772 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:30 compute-1 nova_compute[189066]: 2025-12-05 09:33:30.222 189070 DEBUG nova.network.neutron [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Successfully updated port: f02068c2-6668-4bd6-9364-5ddc884043c6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:33:30 compute-1 nova_compute[189066]: 2025-12-05 09:33:30.443 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:33:30 compute-1 nova_compute[189066]: 2025-12-05 09:33:30.443 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:33:30 compute-1 nova_compute[189066]: 2025-12-05 09:33:30.443 189070 DEBUG nova.network.neutron [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:33:30 compute-1 nova_compute[189066]: 2025-12-05 09:33:30.528 189070 DEBUG nova.compute.manager [req-c2529c60-8800-4252-aaa4-65e9bcd6c683 req-f6b5eb12-8739-4bde-8e63-c2da7ffd723c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-changed-f02068c2-6668-4bd6-9364-5ddc884043c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:33:30 compute-1 nova_compute[189066]: 2025-12-05 09:33:30.528 189070 DEBUG nova.compute.manager [req-c2529c60-8800-4252-aaa4-65e9bcd6c683 req-f6b5eb12-8739-4bde-8e63-c2da7ffd723c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Refreshing instance network info cache due to event network-changed-f02068c2-6668-4bd6-9364-5ddc884043c6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:33:30 compute-1 nova_compute[189066]: 2025-12-05 09:33:30.528 189070 DEBUG oslo_concurrency.lockutils [req-c2529c60-8800-4252-aaa4-65e9bcd6c683 req-f6b5eb12-8739-4bde-8e63-c2da7ffd723c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:33:30 compute-1 nova_compute[189066]: 2025-12-05 09:33:30.715 189070 DEBUG nova.network.neutron [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.249 189070 DEBUG nova.network.neutron [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updating instance_info_cache with network_info: [{"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.465 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.775 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.821 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.821 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Instance network_info: |[{"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.822 189070 DEBUG oslo_concurrency.lockutils [req-c2529c60-8800-4252-aaa4-65e9bcd6c683 req-f6b5eb12-8739-4bde-8e63-c2da7ffd723c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.822 189070 DEBUG nova.network.neutron [req-c2529c60-8800-4252-aaa4-65e9bcd6c683 req-f6b5eb12-8739-4bde-8e63-c2da7ffd723c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Refreshing network info cache for port f02068c2-6668-4bd6-9364-5ddc884043c6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.825 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Start _get_guest_xml network_info=[{"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.830 189070 WARNING nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.839 189070 DEBUG nova.virt.libvirt.host [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.840 189070 DEBUG nova.virt.libvirt.host [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.846 189070 DEBUG nova.virt.libvirt.host [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.847 189070 DEBUG nova.virt.libvirt.host [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.849 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.850 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.850 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.851 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.851 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.851 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.851 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.852 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.852 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.852 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.853 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.853 189070 DEBUG nova.virt.hardware [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.858 189070 DEBUG nova.virt.libvirt.vif [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-620680644',display_name='tempest-TestNetworkBasicOps-server-620680644',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-620680644',id=28,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDm7JtMu9Dp/tAvdvilb6CWwpNoXl3CoNqyGQ9V53LjMNqLGWWqKoK0rxjz9+HMJlVNjmXBoxrWMPvPgS4KFYbqapau5Y14IAaLaUXSF6ZHboVdyTm2c2OurXRERdd7b/w==',key_name='tempest-TestNetworkBasicOps-344419197',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-54wr0cp7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:33:19Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=06bf42f4-0e5e-43c8-81c4-b1df487dafe3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.859 189070 DEBUG nova.network.os_vif_util [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.860 189070 DEBUG nova.network.os_vif_util [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:e6:29,bridge_name='br-int',has_traffic_filtering=True,id=f02068c2-6668-4bd6-9364-5ddc884043c6,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf02068c2-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.861 189070 DEBUG nova.objects.instance [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.884 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <uuid>06bf42f4-0e5e-43c8-81c4-b1df487dafe3</uuid>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <name>instance-0000001c</name>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkBasicOps-server-620680644</nova:name>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:33:34</nova:creationTime>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:33:34 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:33:34 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:33:34 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:33:34 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:33:34 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:33:34 compute-1 nova_compute[189066]:         <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:33:34 compute-1 nova_compute[189066]:         <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:33:34 compute-1 nova_compute[189066]:         <nova:port uuid="f02068c2-6668-4bd6-9364-5ddc884043c6">
Dec 05 09:33:34 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <system>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <entry name="serial">06bf42f4-0e5e-43c8-81c4-b1df487dafe3</entry>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <entry name="uuid">06bf42f4-0e5e-43c8-81c4-b1df487dafe3</entry>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </system>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <os>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   </os>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <features>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   </features>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk.config"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:e5:e6:29"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <target dev="tapf02068c2-66"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/console.log" append="off"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <video>
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </video>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:33:34 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:33:34 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:33:34 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:33:34 compute-1 nova_compute[189066]: </domain>
Dec 05 09:33:34 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.885 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Preparing to wait for external event network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.885 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.886 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.886 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.886 189070 DEBUG nova.virt.libvirt.vif [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-620680644',display_name='tempest-TestNetworkBasicOps-server-620680644',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-620680644',id=28,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDm7JtMu9Dp/tAvdvilb6CWwpNoXl3CoNqyGQ9V53LjMNqLGWWqKoK0rxjz9+HMJlVNjmXBoxrWMPvPgS4KFYbqapau5Y14IAaLaUXSF6ZHboVdyTm2c2OurXRERdd7b/w==',key_name='tempest-TestNetworkBasicOps-344419197',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-54wr0cp7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:33:19Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=06bf42f4-0e5e-43c8-81c4-b1df487dafe3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.887 189070 DEBUG nova.network.os_vif_util [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.887 189070 DEBUG nova.network.os_vif_util [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:e6:29,bridge_name='br-int',has_traffic_filtering=True,id=f02068c2-6668-4bd6-9364-5ddc884043c6,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf02068c2-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.888 189070 DEBUG os_vif [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:e6:29,bridge_name='br-int',has_traffic_filtering=True,id=f02068c2-6668-4bd6-9364-5ddc884043c6,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf02068c2-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.888 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.889 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.889 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.893 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.893 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf02068c2-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.893 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf02068c2-66, col_values=(('external_ids', {'iface-id': 'f02068c2-6668-4bd6-9364-5ddc884043c6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:e6:29', 'vm-uuid': '06bf42f4-0e5e-43c8-81c4-b1df487dafe3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.895 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:34 compute-1 NetworkManager[55704]: <info>  [1764927214.8962] manager: (tapf02068c2-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.898 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.903 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:34 compute-1 nova_compute[189066]: 2025-12-05 09:33:34.904 189070 INFO os_vif [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:e6:29,bridge_name='br-int',has_traffic_filtering=True,id=f02068c2-6668-4bd6-9364-5ddc884043c6,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf02068c2-66')
Dec 05 09:33:35 compute-1 nova_compute[189066]: 2025-12-05 09:33:35.596 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:33:35 compute-1 nova_compute[189066]: 2025-12-05 09:33:35.597 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:33:35 compute-1 nova_compute[189066]: 2025-12-05 09:33:35.597 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:e5:e6:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:33:35 compute-1 nova_compute[189066]: 2025-12-05 09:33:35.598 189070 INFO nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Using config drive
Dec 05 09:33:36 compute-1 podman[227583]: 2025-12-05 09:33:36.621042412 +0000 UTC m=+0.057319551 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:33:36 compute-1 nova_compute[189066]: 2025-12-05 09:33:36.807 189070 INFO nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Creating config drive at /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk.config
Dec 05 09:33:36 compute-1 nova_compute[189066]: 2025-12-05 09:33:36.813 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpprl3ejy2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:33:36 compute-1 nova_compute[189066]: 2025-12-05 09:33:36.944 189070 DEBUG oslo_concurrency.processutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpprl3ejy2" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:33:37 compute-1 kernel: tapf02068c2-66: entered promiscuous mode
Dec 05 09:33:37 compute-1 NetworkManager[55704]: <info>  [1764927217.0140] manager: (tapf02068c2-66): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Dec 05 09:33:37 compute-1 ovn_controller[95809]: 2025-12-05T09:33:37Z|00170|binding|INFO|Claiming lport f02068c2-6668-4bd6-9364-5ddc884043c6 for this chassis.
Dec 05 09:33:37 compute-1 ovn_controller[95809]: 2025-12-05T09:33:37Z|00171|binding|INFO|f02068c2-6668-4bd6-9364-5ddc884043c6: Claiming fa:16:3e:e5:e6:29 10.100.0.10
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.016 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:37 compute-1 systemd-udevd[227622]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:33:37 compute-1 NetworkManager[55704]: <info>  [1764927217.0615] device (tapf02068c2-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:33:37 compute-1 NetworkManager[55704]: <info>  [1764927217.0629] device (tapf02068c2-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.072 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.077 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:37 compute-1 ovn_controller[95809]: 2025-12-05T09:33:37Z|00172|binding|INFO|Setting lport f02068c2-6668-4bd6-9364-5ddc884043c6 ovn-installed in OVS
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.081 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:37 compute-1 systemd-machined[154815]: New machine qemu-13-instance-0000001c.
Dec 05 09:33:37 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-0000001c.
Dec 05 09:33:37 compute-1 ovn_controller[95809]: 2025-12-05T09:33:37Z|00173|binding|INFO|Setting lport f02068c2-6668-4bd6-9364-5ddc884043c6 up in Southbound
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.358 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:e6:29 10.100.0.10'], port_security=['fa:16:3e:e5:e6:29 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '06bf42f4-0e5e-43c8-81c4-b1df487dafe3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-918acd23-3741-478c-80cb-c5530f6594f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c71111d9-c1ab-4a44-83b2-10a513d3fb97', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f705fc6-bf74-428a-8b4c-cdd14a589f25, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=f02068c2-6668-4bd6-9364-5ddc884043c6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.359 105272 INFO neutron.agent.ovn.metadata.agent [-] Port f02068c2-6668-4bd6-9364-5ddc884043c6 in datapath 918acd23-3741-478c-80cb-c5530f6594f8 bound to our chassis
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.361 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 918acd23-3741-478c-80cb-c5530f6594f8
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.376 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d2105555-72a2-4744-a8e9-7c0478df17dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.377 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap918acd23-31 in ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.379 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap918acd23-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.379 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[447a1946-b22a-45e9-8fa5-4440a921b4ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.380 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d3398de4-96d0-4f56-a842-c311681f1ff6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.399 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[abd985c6-03fd-45db-89c2-10190bfb459e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.428 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f3deadce-9ee2-4c3e-9a98-ec330da8a606]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.460 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[e48bf654-fc74-4219-af80-0010e21ad303]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 NetworkManager[55704]: <info>  [1764927217.4720] manager: (tap918acd23-30): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.473 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a26fac-d536-42a7-a85d-fe7c001d3875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.507 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[c45b104f-9895-4260-b6b8-731cbf7ad771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.510 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[83c14927-9edc-4015-bd61-1f605a8a54fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 NetworkManager[55704]: <info>  [1764927217.5341] device (tap918acd23-30): carrier: link connected
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.538 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[59ea6979-6d32-4ccc-9993-d86b498261c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.557 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5e0247-8713-4600-885f-f427ae9b93ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap918acd23-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:ed:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449053, 'reachable_time': 41488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227666, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.580 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5454d480-cdf1-4867-8d24-a4401cc797c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:ed28'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449053, 'tstamp': 449053}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227667, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.599 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[255a9742-98bc-4d59-99a0-24ee795db74c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap918acd23-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:ed:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449053, 'reachable_time': 41488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227668, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.628 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927217.6283164, 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.629 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] VM Started (Lifecycle Event)
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.638 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[42a650b1-e604-461c-8022-1eb545910052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.674 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.679 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927217.628549, 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.679 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] VM Paused (Lifecycle Event)
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.711 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff96404-fb67-4ed8-bb5a-2c6a115ca485]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.713 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap918acd23-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.713 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.714 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap918acd23-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.715 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.716 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:37 compute-1 kernel: tap918acd23-30: entered promiscuous mode
Dec 05 09:33:37 compute-1 NetworkManager[55704]: <info>  [1764927217.7169] manager: (tap918acd23-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.718 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.719 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap918acd23-30, col_values=(('external_ids', {'iface-id': '48adc250-a1d1-4155-84e6-c2b0cd01f0be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:33:37 compute-1 ovn_controller[95809]: 2025-12-05T09:33:37Z|00174|binding|INFO|Releasing lport 48adc250-a1d1-4155-84e6-c2b0cd01f0be from this chassis (sb_readonly=0)
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.723 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.731 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.732 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/918acd23-3741-478c-80cb-c5530f6594f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/918acd23-3741-478c-80cb-c5530f6594f8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.733 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[84c34025-db33-467a-ba51-7f7c2746cf7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.734 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-918acd23-3741-478c-80cb-c5530f6594f8
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/918acd23-3741-478c-80cb-c5530f6594f8.pid.haproxy
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 918acd23-3741-478c-80cb-c5530f6594f8
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:33:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:37.734 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'env', 'PROCESS_TAG=haproxy-918acd23-3741-478c-80cb-c5530f6594f8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/918acd23-3741-478c-80cb-c5530f6594f8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:33:37 compute-1 nova_compute[189066]: 2025-12-05 09:33:37.756 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:33:38 compute-1 podman[227701]: 2025-12-05 09:33:38.109030963 +0000 UTC m=+0.048559365 container create ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 09:33:38 compute-1 systemd[1]: Started libpod-conmon-ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6.scope.
Dec 05 09:33:38 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:33:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e76c29f90911fd5c27181b6bb69505fa5d087040bd156891b7f30dbd3a661bb5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:33:38 compute-1 podman[227701]: 2025-12-05 09:33:38.082840969 +0000 UTC m=+0.022369381 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:33:38 compute-1 podman[227701]: 2025-12-05 09:33:38.18813438 +0000 UTC m=+0.127662792 container init ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:33:38 compute-1 podman[227701]: 2025-12-05 09:33:38.193593224 +0000 UTC m=+0.133121616 container start ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 09:33:38 compute-1 neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8[227715]: [NOTICE]   (227719) : New worker (227721) forked
Dec 05 09:33:38 compute-1 neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8[227715]: [NOTICE]   (227719) : Loading success.
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.554 189070 DEBUG nova.compute.manager [req-42eb6733-ebef-4c49-91e4-5aacd7c5ad6b req-6362199b-e09b-4348-b3eb-17340211a106 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.554 189070 DEBUG oslo_concurrency.lockutils [req-42eb6733-ebef-4c49-91e4-5aacd7c5ad6b req-6362199b-e09b-4348-b3eb-17340211a106 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.554 189070 DEBUG oslo_concurrency.lockutils [req-42eb6733-ebef-4c49-91e4-5aacd7c5ad6b req-6362199b-e09b-4348-b3eb-17340211a106 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.554 189070 DEBUG oslo_concurrency.lockutils [req-42eb6733-ebef-4c49-91e4-5aacd7c5ad6b req-6362199b-e09b-4348-b3eb-17340211a106 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.555 189070 DEBUG nova.compute.manager [req-42eb6733-ebef-4c49-91e4-5aacd7c5ad6b req-6362199b-e09b-4348-b3eb-17340211a106 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Processing event network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.555 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.560 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927219.5601902, 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.560 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] VM Resumed (Lifecycle Event)
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.564 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.567 189070 INFO nova.virt.libvirt.driver [-] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Instance spawned successfully.
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.567 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.590 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.596 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.597 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.597 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.598 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.598 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.599 189070 DEBUG nova.virt.libvirt.driver [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.604 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.636 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.709 189070 INFO nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Took 20.51 seconds to spawn the instance on the hypervisor.
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.709 189070 DEBUG nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.746 189070 DEBUG nova.network.neutron [req-c2529c60-8800-4252-aaa4-65e9bcd6c683 req-f6b5eb12-8739-4bde-8e63-c2da7ffd723c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updated VIF entry in instance network info cache for port f02068c2-6668-4bd6-9364-5ddc884043c6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.747 189070 DEBUG nova.network.neutron [req-c2529c60-8800-4252-aaa4-65e9bcd6c683 req-f6b5eb12-8739-4bde-8e63-c2da7ffd723c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updating instance_info_cache with network_info: [{"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.774 189070 DEBUG oslo_concurrency.lockutils [req-c2529c60-8800-4252-aaa4-65e9bcd6c683 req-f6b5eb12-8739-4bde-8e63-c2da7ffd723c 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.778 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.788 189070 INFO nova.compute.manager [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Took 21.18 seconds to build instance.
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.821 189070 DEBUG oslo_concurrency.lockutils [None req-954ff84d-cfc2-4696-8ce6-0c1972ea5f96 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:39 compute-1 nova_compute[189066]: 2025-12-05 09:33:39.896 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:42 compute-1 nova_compute[189066]: 2025-12-05 09:33:42.521 189070 DEBUG nova.compute.manager [req-fe324341-587c-4b20-941b-23e23bc2c0ff req-883706c1-fd75-47b0-8133-7d981bc682b5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:33:42 compute-1 nova_compute[189066]: 2025-12-05 09:33:42.522 189070 DEBUG oslo_concurrency.lockutils [req-fe324341-587c-4b20-941b-23e23bc2c0ff req-883706c1-fd75-47b0-8133-7d981bc682b5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:33:42 compute-1 nova_compute[189066]: 2025-12-05 09:33:42.522 189070 DEBUG oslo_concurrency.lockutils [req-fe324341-587c-4b20-941b-23e23bc2c0ff req-883706c1-fd75-47b0-8133-7d981bc682b5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:33:42 compute-1 nova_compute[189066]: 2025-12-05 09:33:42.522 189070 DEBUG oslo_concurrency.lockutils [req-fe324341-587c-4b20-941b-23e23bc2c0ff req-883706c1-fd75-47b0-8133-7d981bc682b5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:33:42 compute-1 nova_compute[189066]: 2025-12-05 09:33:42.522 189070 DEBUG nova.compute.manager [req-fe324341-587c-4b20-941b-23e23bc2c0ff req-883706c1-fd75-47b0-8133-7d981bc682b5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] No waiting events found dispatching network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:33:42 compute-1 nova_compute[189066]: 2025-12-05 09:33:42.522 189070 WARNING nova.compute.manager [req-fe324341-587c-4b20-941b-23e23bc2c0ff req-883706c1-fd75-47b0-8133-7d981bc682b5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received unexpected event network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 for instance with vm_state active and task_state None.
Dec 05 09:33:43 compute-1 podman[227730]: 2025-12-05 09:33:43.643682071 +0000 UTC m=+0.083139726 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Dec 05 09:33:44 compute-1 nova_compute[189066]: 2025-12-05 09:33:44.779 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:44 compute-1 nova_compute[189066]: 2025-12-05 09:33:44.898 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:45 compute-1 NetworkManager[55704]: <info>  [1764927225.7950] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Dec 05 09:33:45 compute-1 NetworkManager[55704]: <info>  [1764927225.7960] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec 05 09:33:45 compute-1 nova_compute[189066]: 2025-12-05 09:33:45.794 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:45 compute-1 nova_compute[189066]: 2025-12-05 09:33:45.956 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:45 compute-1 ovn_controller[95809]: 2025-12-05T09:33:45Z|00175|binding|INFO|Releasing lport 48adc250-a1d1-4155-84e6-c2b0cd01f0be from this chassis (sb_readonly=0)
Dec 05 09:33:45 compute-1 nova_compute[189066]: 2025-12-05 09:33:45.980 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:47 compute-1 podman[227752]: 2025-12-05 09:33:47.653089883 +0000 UTC m=+0.088527839 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 09:33:47 compute-1 nova_compute[189066]: 2025-12-05 09:33:47.736 189070 DEBUG nova.compute.manager [req-13a70e73-49a6-4faf-9e10-c6b28ae18a01 req-c4328c15-5a55-4dac-82f8-ce63abfb80bb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-changed-f02068c2-6668-4bd6-9364-5ddc884043c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:33:47 compute-1 nova_compute[189066]: 2025-12-05 09:33:47.737 189070 DEBUG nova.compute.manager [req-13a70e73-49a6-4faf-9e10-c6b28ae18a01 req-c4328c15-5a55-4dac-82f8-ce63abfb80bb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Refreshing instance network info cache due to event network-changed-f02068c2-6668-4bd6-9364-5ddc884043c6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:33:47 compute-1 nova_compute[189066]: 2025-12-05 09:33:47.737 189070 DEBUG oslo_concurrency.lockutils [req-13a70e73-49a6-4faf-9e10-c6b28ae18a01 req-c4328c15-5a55-4dac-82f8-ce63abfb80bb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:33:47 compute-1 nova_compute[189066]: 2025-12-05 09:33:47.737 189070 DEBUG oslo_concurrency.lockutils [req-13a70e73-49a6-4faf-9e10-c6b28ae18a01 req-c4328c15-5a55-4dac-82f8-ce63abfb80bb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:33:47 compute-1 nova_compute[189066]: 2025-12-05 09:33:47.738 189070 DEBUG nova.network.neutron [req-13a70e73-49a6-4faf-9e10-c6b28ae18a01 req-c4328c15-5a55-4dac-82f8-ce63abfb80bb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Refreshing network info cache for port f02068c2-6668-4bd6-9364-5ddc884043c6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:33:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:49.002 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:c7:a7 2001:db8:0:1:f816:3eff:fe4e:c7a7 2001:db8::f816:3eff:fe4e:c7a7'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe4e:c7a7/64 2001:db8::f816:3eff:fe4e:c7a7/64', 'neutron:device_id': 'ovnmeta-1ee7616f-eb09-48d4-95fd-44f73d99bae9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ee7616f-eb09-48d4-95fd-44f73d99bae9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da3f5033-2564-469b-98a2-3dc266c0c01a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b882a982-6090-4679-ae56-430e728f7b16) old=Port_Binding(mac=['fa:16:3e:4e:c7:a7 2001:db8::f816:3eff:fe4e:c7a7'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4e:c7a7/64', 'neutron:device_id': 'ovnmeta-1ee7616f-eb09-48d4-95fd-44f73d99bae9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ee7616f-eb09-48d4-95fd-44f73d99bae9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:33:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:49.005 105272 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b882a982-6090-4679-ae56-430e728f7b16 in datapath 1ee7616f-eb09-48d4-95fd-44f73d99bae9 updated
Dec 05 09:33:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:49.007 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1ee7616f-eb09-48d4-95fd-44f73d99bae9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:33:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:33:49.008 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[63ad713a-215b-4c46-9d89-b0f30854894b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:33:49 compute-1 nova_compute[189066]: 2025-12-05 09:33:49.280 189070 DEBUG nova.network.neutron [req-13a70e73-49a6-4faf-9e10-c6b28ae18a01 req-c4328c15-5a55-4dac-82f8-ce63abfb80bb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updated VIF entry in instance network info cache for port f02068c2-6668-4bd6-9364-5ddc884043c6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:33:49 compute-1 nova_compute[189066]: 2025-12-05 09:33:49.281 189070 DEBUG nova.network.neutron [req-13a70e73-49a6-4faf-9e10-c6b28ae18a01 req-c4328c15-5a55-4dac-82f8-ce63abfb80bb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updating instance_info_cache with network_info: [{"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:33:49 compute-1 nova_compute[189066]: 2025-12-05 09:33:49.319 189070 DEBUG oslo_concurrency.lockutils [req-13a70e73-49a6-4faf-9e10-c6b28ae18a01 req-c4328c15-5a55-4dac-82f8-ce63abfb80bb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:33:49 compute-1 nova_compute[189066]: 2025-12-05 09:33:49.782 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:49 compute-1 nova_compute[189066]: 2025-12-05 09:33:49.900 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:50 compute-1 podman[227778]: 2025-12-05 09:33:50.672036543 +0000 UTC m=+0.093747337 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 09:33:53 compute-1 ovn_controller[95809]: 2025-12-05T09:33:53Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:e6:29 10.100.0.10
Dec 05 09:33:53 compute-1 ovn_controller[95809]: 2025-12-05T09:33:53Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:e6:29 10.100.0.10
Dec 05 09:33:53 compute-1 podman[227806]: 2025-12-05 09:33:53.15670299 +0000 UTC m=+0.091265867 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:33:54 compute-1 nova_compute[189066]: 2025-12-05 09:33:54.785 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:54 compute-1 nova_compute[189066]: 2025-12-05 09:33:54.902 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:56 compute-1 nova_compute[189066]: 2025-12-05 09:33:56.121 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:56 compute-1 podman[227826]: 2025-12-05 09:33:56.636881262 +0000 UTC m=+0.064277369 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Dec 05 09:33:59 compute-1 nova_compute[189066]: 2025-12-05 09:33:59.788 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:33:59 compute-1 nova_compute[189066]: 2025-12-05 09:33:59.858 189070 INFO nova.compute.manager [None req-b0aaa5a8-7b38-490d-86fd-5460d9e931fc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Get console output
Dec 05 09:33:59 compute-1 nova_compute[189066]: 2025-12-05 09:33:59.865 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:33:59 compute-1 nova_compute[189066]: 2025-12-05 09:33:59.904 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:00 compute-1 podman[227847]: 2025-12-05 09:34:00.652484906 +0000 UTC m=+0.090251739 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:34:02 compute-1 nova_compute[189066]: 2025-12-05 09:34:02.174 189070 DEBUG nova.compute.manager [req-e7ef0543-1fd3-44fd-807a-3f4cf0b2acbc req-9cb5d0d8-4290-4ce7-9d2a-1f4d4da271a2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-changed-f02068c2-6668-4bd6-9364-5ddc884043c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:02 compute-1 nova_compute[189066]: 2025-12-05 09:34:02.174 189070 DEBUG nova.compute.manager [req-e7ef0543-1fd3-44fd-807a-3f4cf0b2acbc req-9cb5d0d8-4290-4ce7-9d2a-1f4d4da271a2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Refreshing instance network info cache due to event network-changed-f02068c2-6668-4bd6-9364-5ddc884043c6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:34:02 compute-1 nova_compute[189066]: 2025-12-05 09:34:02.175 189070 DEBUG oslo_concurrency.lockutils [req-e7ef0543-1fd3-44fd-807a-3f4cf0b2acbc req-9cb5d0d8-4290-4ce7-9d2a-1f4d4da271a2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:34:02 compute-1 nova_compute[189066]: 2025-12-05 09:34:02.175 189070 DEBUG oslo_concurrency.lockutils [req-e7ef0543-1fd3-44fd-807a-3f4cf0b2acbc req-9cb5d0d8-4290-4ce7-9d2a-1f4d4da271a2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:34:02 compute-1 nova_compute[189066]: 2025-12-05 09:34:02.175 189070 DEBUG nova.network.neutron [req-e7ef0543-1fd3-44fd-807a-3f4cf0b2acbc req-9cb5d0d8-4290-4ce7-9d2a-1f4d4da271a2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Refreshing network info cache for port f02068c2-6668-4bd6-9364-5ddc884043c6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:34:04 compute-1 nova_compute[189066]: 2025-12-05 09:34:04.790 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:04 compute-1 nova_compute[189066]: 2025-12-05 09:34:04.906 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:06 compute-1 nova_compute[189066]: 2025-12-05 09:34:06.095 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:06 compute-1 nova_compute[189066]: 2025-12-05 09:34:06.108 189070 DEBUG nova.network.neutron [req-e7ef0543-1fd3-44fd-807a-3f4cf0b2acbc req-9cb5d0d8-4290-4ce7-9d2a-1f4d4da271a2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updated VIF entry in instance network info cache for port f02068c2-6668-4bd6-9364-5ddc884043c6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:34:06 compute-1 nova_compute[189066]: 2025-12-05 09:34:06.109 189070 DEBUG nova.network.neutron [req-e7ef0543-1fd3-44fd-807a-3f4cf0b2acbc req-9cb5d0d8-4290-4ce7-9d2a-1f4d4da271a2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updating instance_info_cache with network_info: [{"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:34:06 compute-1 nova_compute[189066]: 2025-12-05 09:34:06.133 189070 DEBUG oslo_concurrency.lockutils [req-e7ef0543-1fd3-44fd-807a-3f4cf0b2acbc req-9cb5d0d8-4290-4ce7-9d2a-1f4d4da271a2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:34:07 compute-1 podman[227872]: 2025-12-05 09:34:07.629992898 +0000 UTC m=+0.065202161 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:34:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:08.878 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:08.880 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:08.880 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:09 compute-1 nova_compute[189066]: 2025-12-05 09:34:09.792 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:09 compute-1 nova_compute[189066]: 2025-12-05 09:34:09.909 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:13 compute-1 nova_compute[189066]: 2025-12-05 09:34:13.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:13 compute-1 sshd-session[227898]: Received disconnect from 122.114.113.177 port 45638:11: Bye Bye [preauth]
Dec 05 09:34:13 compute-1 sshd-session[227898]: Disconnected from authenticating user root 122.114.113.177 port 45638 [preauth]
Dec 05 09:34:14 compute-1 nova_compute[189066]: 2025-12-05 09:34:14.580 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "438132a9-5f29-4b35-b457-98c182eb1660" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:14 compute-1 nova_compute[189066]: 2025-12-05 09:34:14.580 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:14 compute-1 nova_compute[189066]: 2025-12-05 09:34:14.611 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:34:14 compute-1 podman[227900]: 2025-12-05 09:34:14.662071928 +0000 UTC m=+0.092295322 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 09:34:14 compute-1 nova_compute[189066]: 2025-12-05 09:34:14.795 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:14 compute-1 nova_compute[189066]: 2025-12-05 09:34:14.911 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.005 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.006 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.014 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.014 189070 INFO nova.compute.claims [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.017 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.214 189070 DEBUG nova.compute.provider_tree [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.248 189070 DEBUG nova.scheduler.client.report [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.288 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.290 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.373 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.374 189070 DEBUG nova.network.neutron [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.415 189070 INFO nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.441 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.561 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.563 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.563 189070 INFO nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Creating image(s)
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.564 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "/var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.566 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.566 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.581 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.643 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.644 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.645 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.655 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.716 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.717 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.759 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.760 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.760 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.818 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.819 189070 DEBUG nova.virt.disk.api [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Checking if we can resize image /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.820 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.882 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.883 189070 DEBUG nova.virt.disk.api [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Cannot resize image /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.883 189070 DEBUG nova.objects.instance [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 438132a9-5f29-4b35-b457-98c182eb1660 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.897 189070 DEBUG nova.policy [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.950 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.951 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Ensure instance console log exists: /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.951 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.952 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:15 compute-1 nova_compute[189066]: 2025-12-05 09:34:15.952 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.073 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.074 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.074 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.074 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.203 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.281 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.283 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.354 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.551 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.553 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5577MB free_disk=73.30462265014648GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.553 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.553 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.731 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.731 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 438132a9-5f29-4b35-b457-98c182eb1660 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.731 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.732 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.825 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.845 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.895 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:34:17 compute-1 nova_compute[189066]: 2025-12-05 09:34:17.895 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:18 compute-1 podman[227940]: 2025-12-05 09:34:18.645257474 +0000 UTC m=+0.084272886 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 09:34:18 compute-1 nova_compute[189066]: 2025-12-05 09:34:18.895 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:18 compute-1 nova_compute[189066]: 2025-12-05 09:34:18.896 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:34:19 compute-1 nova_compute[189066]: 2025-12-05 09:34:19.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:19.725 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:34:19 compute-1 nova_compute[189066]: 2025-12-05 09:34:19.726 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:19 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:19.726 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:34:19 compute-1 nova_compute[189066]: 2025-12-05 09:34:19.797 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:19 compute-1 nova_compute[189066]: 2025-12-05 09:34:19.913 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:20 compute-1 nova_compute[189066]: 2025-12-05 09:34:20.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:20 compute-1 nova_compute[189066]: 2025-12-05 09:34:20.204 189070 DEBUG nova.network.neutron [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Successfully created port: 20f9bafd-4df6-4d4a-849b-c578c6f0bcef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:34:21 compute-1 nova_compute[189066]: 2025-12-05 09:34:21.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:21 compute-1 nova_compute[189066]: 2025-12-05 09:34:21.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:34:21 compute-1 nova_compute[189066]: 2025-12-05 09:34:21.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:34:21 compute-1 nova_compute[189066]: 2025-12-05 09:34:21.067 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 09:34:21 compute-1 nova_compute[189066]: 2025-12-05 09:34:21.331 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:34:21 compute-1 nova_compute[189066]: 2025-12-05 09:34:21.332 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:34:21 compute-1 nova_compute[189066]: 2025-12-05 09:34:21.332 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:34:21 compute-1 nova_compute[189066]: 2025-12-05 09:34:21.332 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:34:21 compute-1 podman[227967]: 2025-12-05 09:34:21.610836077 +0000 UTC m=+0.050996055 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.046 189070 DEBUG nova.network.neutron [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Successfully updated port: 20f9bafd-4df6-4d4a-849b-c578c6f0bcef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.075 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.076 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.076 189070 DEBUG nova.network.neutron [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.431 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updating instance_info_cache with network_info: [{"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.462 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-06bf42f4-0e5e-43c8-81c4-b1df487dafe3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.462 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.463 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:34:23 compute-1 nova_compute[189066]: 2025-12-05 09:34:23.513 189070 DEBUG nova.network.neutron [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:34:23 compute-1 podman[227988]: 2025-12-05 09:34:23.620578582 +0000 UTC m=+0.054318296 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:34:24 compute-1 nova_compute[189066]: 2025-12-05 09:34:24.818 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:24 compute-1 nova_compute[189066]: 2025-12-05 09:34:24.915 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:25 compute-1 nova_compute[189066]: 2025-12-05 09:34:25.719 189070 DEBUG nova.compute.manager [req-5b536c13-bab2-422c-b0a6-c618544fca5e req-b8b2a648-63cc-4b3d-bcba-5fae51e1232e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received event network-changed-20f9bafd-4df6-4d4a-849b-c578c6f0bcef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:25 compute-1 nova_compute[189066]: 2025-12-05 09:34:25.720 189070 DEBUG nova.compute.manager [req-5b536c13-bab2-422c-b0a6-c618544fca5e req-b8b2a648-63cc-4b3d-bcba-5fae51e1232e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Refreshing instance network info cache due to event network-changed-20f9bafd-4df6-4d4a-849b-c578c6f0bcef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:34:25 compute-1 nova_compute[189066]: 2025-12-05 09:34:25.720 189070 DEBUG oslo_concurrency.lockutils [req-5b536c13-bab2-422c-b0a6-c618544fca5e req-b8b2a648-63cc-4b3d-bcba-5fae51e1232e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.290 189070 DEBUG nova.network.neutron [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Updating instance_info_cache with network_info: [{"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.325 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.326 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Instance network_info: |[{"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.326 189070 DEBUG oslo_concurrency.lockutils [req-5b536c13-bab2-422c-b0a6-c618544fca5e req-b8b2a648-63cc-4b3d-bcba-5fae51e1232e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.326 189070 DEBUG nova.network.neutron [req-5b536c13-bab2-422c-b0a6-c618544fca5e req-b8b2a648-63cc-4b3d-bcba-5fae51e1232e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Refreshing network info cache for port 20f9bafd-4df6-4d4a-849b-c578c6f0bcef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.329 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Start _get_guest_xml network_info=[{"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.334 189070 WARNING nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.340 189070 DEBUG nova.virt.libvirt.host [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.341 189070 DEBUG nova.virt.libvirt.host [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.365 189070 DEBUG nova.virt.libvirt.host [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.366 189070 DEBUG nova.virt.libvirt.host [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.368 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.368 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.369 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.369 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.369 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.369 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.369 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.370 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.370 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.370 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.371 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.371 189070 DEBUG nova.virt.hardware [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.376 189070 DEBUG nova.virt.libvirt.vif [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:34:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2088978795',display_name='tempest-TestNetworkBasicOps-server-2088978795',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2088978795',id=32,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMuwUlWk2J8MhLzGrVSCPKsjZ2cn2j5HsZgCuIma4m+NLQ3JmfTlLaKyz+KSaQTRKJUK07n4T6cKvrKFN6/xGM5MvB+0fJJJDxn949bj5AoXo6fvLq7r3jh6uryDDUZ5zQ==',key_name='tempest-TestNetworkBasicOps-829941913',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-jwd1ncb2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:34:15Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=438132a9-5f29-4b35-b457-98c182eb1660,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.376 189070 DEBUG nova.network.os_vif_util [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.377 189070 DEBUG nova.network.os_vif_util [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:6f:05,bridge_name='br-int',has_traffic_filtering=True,id=20f9bafd-4df6-4d4a-849b-c578c6f0bcef,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20f9bafd-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.378 189070 DEBUG nova.objects.instance [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 438132a9-5f29-4b35-b457-98c182eb1660 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.404 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <uuid>438132a9-5f29-4b35-b457-98c182eb1660</uuid>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <name>instance-00000020</name>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkBasicOps-server-2088978795</nova:name>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:34:26</nova:creationTime>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:34:26 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:34:26 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:34:26 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:34:26 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:34:26 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:34:26 compute-1 nova_compute[189066]:         <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:34:26 compute-1 nova_compute[189066]:         <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:34:26 compute-1 nova_compute[189066]:         <nova:port uuid="20f9bafd-4df6-4d4a-849b-c578c6f0bcef">
Dec 05 09:34:26 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <system>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <entry name="serial">438132a9-5f29-4b35-b457-98c182eb1660</entry>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <entry name="uuid">438132a9-5f29-4b35-b457-98c182eb1660</entry>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </system>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <os>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   </os>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <features>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   </features>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk.config"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:e4:6f:05"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <target dev="tap20f9bafd-4d"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/console.log" append="off"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <video>
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </video>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:34:26 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:34:26 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:34:26 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:34:26 compute-1 nova_compute[189066]: </domain>
Dec 05 09:34:26 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.405 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Preparing to wait for external event network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.405 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "438132a9-5f29-4b35-b457-98c182eb1660-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.406 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.406 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.407 189070 DEBUG nova.virt.libvirt.vif [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:34:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2088978795',display_name='tempest-TestNetworkBasicOps-server-2088978795',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2088978795',id=32,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMuwUlWk2J8MhLzGrVSCPKsjZ2cn2j5HsZgCuIma4m+NLQ3JmfTlLaKyz+KSaQTRKJUK07n4T6cKvrKFN6/xGM5MvB+0fJJJDxn949bj5AoXo6fvLq7r3jh6uryDDUZ5zQ==',key_name='tempest-TestNetworkBasicOps-829941913',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-jwd1ncb2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:34:15Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=438132a9-5f29-4b35-b457-98c182eb1660,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.407 189070 DEBUG nova.network.os_vif_util [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.408 189070 DEBUG nova.network.os_vif_util [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:6f:05,bridge_name='br-int',has_traffic_filtering=True,id=20f9bafd-4df6-4d4a-849b-c578c6f0bcef,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20f9bafd-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.408 189070 DEBUG os_vif [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:6f:05,bridge_name='br-int',has_traffic_filtering=True,id=20f9bafd-4df6-4d4a-849b-c578c6f0bcef,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20f9bafd-4d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.409 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.409 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.410 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.413 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.414 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20f9bafd-4d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.414 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap20f9bafd-4d, col_values=(('external_ids', {'iface-id': '20f9bafd-4df6-4d4a-849b-c578c6f0bcef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:6f:05', 'vm-uuid': '438132a9-5f29-4b35-b457-98c182eb1660'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.415 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:26 compute-1 NetworkManager[55704]: <info>  [1764927266.4166] manager: (tap20f9bafd-4d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.418 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.423 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.424 189070 INFO os_vif [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:6f:05,bridge_name='br-int',has_traffic_filtering=True,id=20f9bafd-4df6-4d4a-849b-c578c6f0bcef,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20f9bafd-4d')
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.492 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.493 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.493 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:e4:6f:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:34:26 compute-1 nova_compute[189066]: 2025-12-05 09:34:26.495 189070 INFO nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Using config drive
Dec 05 09:34:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:26.729 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:27 compute-1 nova_compute[189066]: 2025-12-05 09:34:27.627 189070 INFO nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Creating config drive at /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk.config
Dec 05 09:34:27 compute-1 podman[228012]: 2025-12-05 09:34:27.630678385 +0000 UTC m=+0.064762520 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Dec 05 09:34:27 compute-1 nova_compute[189066]: 2025-12-05 09:34:27.633 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn9f7y6y6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:34:27 compute-1 nova_compute[189066]: 2025-12-05 09:34:27.765 189070 DEBUG oslo_concurrency.processutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn9f7y6y6" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:34:27 compute-1 kernel: tap20f9bafd-4d: entered promiscuous mode
Dec 05 09:34:27 compute-1 NetworkManager[55704]: <info>  [1764927267.8320] manager: (tap20f9bafd-4d): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Dec 05 09:34:27 compute-1 ovn_controller[95809]: 2025-12-05T09:34:27Z|00176|binding|INFO|Claiming lport 20f9bafd-4df6-4d4a-849b-c578c6f0bcef for this chassis.
Dec 05 09:34:27 compute-1 ovn_controller[95809]: 2025-12-05T09:34:27Z|00177|binding|INFO|20f9bafd-4df6-4d4a-849b-c578c6f0bcef: Claiming fa:16:3e:e4:6f:05 10.100.0.9
Dec 05 09:34:27 compute-1 nova_compute[189066]: 2025-12-05 09:34:27.833 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.841 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:6f:05 10.100.0.9'], port_security=['fa:16:3e:e4:6f:05 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '438132a9-5f29-4b35-b457-98c182eb1660', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-918acd23-3741-478c-80cb-c5530f6594f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '06a3815b-7930-400f-8e26-f165abb3d0b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f705fc6-bf74-428a-8b4c-cdd14a589f25, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=20f9bafd-4df6-4d4a-849b-c578c6f0bcef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.843 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 20f9bafd-4df6-4d4a-849b-c578c6f0bcef in datapath 918acd23-3741-478c-80cb-c5530f6594f8 bound to our chassis
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.845 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 918acd23-3741-478c-80cb-c5530f6594f8
Dec 05 09:34:27 compute-1 ovn_controller[95809]: 2025-12-05T09:34:27Z|00178|binding|INFO|Setting lport 20f9bafd-4df6-4d4a-849b-c578c6f0bcef ovn-installed in OVS
Dec 05 09:34:27 compute-1 ovn_controller[95809]: 2025-12-05T09:34:27Z|00179|binding|INFO|Setting lport 20f9bafd-4df6-4d4a-849b-c578c6f0bcef up in Southbound
Dec 05 09:34:27 compute-1 nova_compute[189066]: 2025-12-05 09:34:27.847 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:27 compute-1 nova_compute[189066]: 2025-12-05 09:34:27.850 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.867 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[cb2c7189-7791-4ff1-a829-12b31ae7605a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:27 compute-1 systemd-udevd[228048]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:34:27 compute-1 systemd-machined[154815]: New machine qemu-14-instance-00000020.
Dec 05 09:34:27 compute-1 NetworkManager[55704]: <info>  [1764927267.8850] device (tap20f9bafd-4d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:34:27 compute-1 NetworkManager[55704]: <info>  [1764927267.8859] device (tap20f9bafd-4d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:34:27 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-00000020.
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.904 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[37ddc9e8-37e8-416f-a42a-a2253fb4405a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.908 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[b975c3df-f815-4dcf-8e0b-c4ec54dda5e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.943 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[0054c752-2b5e-4d26-8901-af33e5e9e399]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.964 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5de057-04ba-4092-b2d6-81116a73dbc6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap918acd23-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:ed:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449053, 'reachable_time': 41488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228062, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.987 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8d39359d-162e-407a-a0d5-1af5afc93afb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap918acd23-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449067, 'tstamp': 449067}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228064, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap918acd23-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449070, 'tstamp': 449070}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228064, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.989 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap918acd23-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:27 compute-1 nova_compute[189066]: 2025-12-05 09:34:27.990 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.992 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap918acd23-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.992 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.992 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap918acd23-30, col_values=(('external_ids', {'iface-id': '48adc250-a1d1-4155-84e6-c2b0cd01f0be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:27.993 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:34:28 compute-1 nova_compute[189066]: 2025-12-05 09:34:28.386 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927268.3853936, 438132a9-5f29-4b35-b457-98c182eb1660 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:34:28 compute-1 nova_compute[189066]: 2025-12-05 09:34:28.386 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] VM Started (Lifecycle Event)
Dec 05 09:34:28 compute-1 nova_compute[189066]: 2025-12-05 09:34:28.411 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:34:28 compute-1 nova_compute[189066]: 2025-12-05 09:34:28.417 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927268.3857334, 438132a9-5f29-4b35-b457-98c182eb1660 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:34:28 compute-1 nova_compute[189066]: 2025-12-05 09:34:28.418 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] VM Paused (Lifecycle Event)
Dec 05 09:34:28 compute-1 nova_compute[189066]: 2025-12-05 09:34:28.494 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:34:28 compute-1 nova_compute[189066]: 2025-12-05 09:34:28.498 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:34:28 compute-1 nova_compute[189066]: 2025-12-05 09:34:28.542 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.147 189070 DEBUG nova.network.neutron [req-5b536c13-bab2-422c-b0a6-c618544fca5e req-b8b2a648-63cc-4b3d-bcba-5fae51e1232e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Updated VIF entry in instance network info cache for port 20f9bafd-4df6-4d4a-849b-c578c6f0bcef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.148 189070 DEBUG nova.network.neutron [req-5b536c13-bab2-422c-b0a6-c618544fca5e req-b8b2a648-63cc-4b3d-bcba-5fae51e1232e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Updating instance_info_cache with network_info: [{"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.185 189070 DEBUG oslo_concurrency.lockutils [req-5b536c13-bab2-422c-b0a6-c618544fca5e req-b8b2a648-63cc-4b3d-bcba-5fae51e1232e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.820 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.900 189070 DEBUG nova.compute.manager [req-d5e8827c-e058-4889-98e0-551575c290b2 req-576538ca-0f15-4023-86d8-97758d3382d0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received event network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.901 189070 DEBUG oslo_concurrency.lockutils [req-d5e8827c-e058-4889-98e0-551575c290b2 req-576538ca-0f15-4023-86d8-97758d3382d0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "438132a9-5f29-4b35-b457-98c182eb1660-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.901 189070 DEBUG oslo_concurrency.lockutils [req-d5e8827c-e058-4889-98e0-551575c290b2 req-576538ca-0f15-4023-86d8-97758d3382d0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.902 189070 DEBUG oslo_concurrency.lockutils [req-d5e8827c-e058-4889-98e0-551575c290b2 req-576538ca-0f15-4023-86d8-97758d3382d0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.902 189070 DEBUG nova.compute.manager [req-d5e8827c-e058-4889-98e0-551575c290b2 req-576538ca-0f15-4023-86d8-97758d3382d0 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Processing event network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.903 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.908 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927269.907555, 438132a9-5f29-4b35-b457-98c182eb1660 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.908 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] VM Resumed (Lifecycle Event)
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.912 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.915 189070 INFO nova.virt.libvirt.driver [-] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Instance spawned successfully.
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.916 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.939 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.944 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.944 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.944 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.945 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.945 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.945 189070 DEBUG nova.virt.libvirt.driver [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.950 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:34:29 compute-1 nova_compute[189066]: 2025-12-05 09:34:29.993 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:34:30 compute-1 nova_compute[189066]: 2025-12-05 09:34:30.022 189070 INFO nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Took 14.46 seconds to spawn the instance on the hypervisor.
Dec 05 09:34:30 compute-1 nova_compute[189066]: 2025-12-05 09:34:30.022 189070 DEBUG nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:34:30 compute-1 nova_compute[189066]: 2025-12-05 09:34:30.102 189070 INFO nova.compute.manager [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Took 15.14 seconds to build instance.
Dec 05 09:34:30 compute-1 nova_compute[189066]: 2025-12-05 09:34:30.127 189070 DEBUG oslo_concurrency.lockutils [None req-31adc746-5ca3-479b-98c5-9c86b1b8f9c1 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:31 compute-1 nova_compute[189066]: 2025-12-05 09:34:31.417 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:31 compute-1 podman[228072]: 2025-12-05 09:34:31.660619381 +0000 UTC m=+0.086109041 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:34:32 compute-1 nova_compute[189066]: 2025-12-05 09:34:32.163 189070 DEBUG nova.compute.manager [req-be3f229e-061c-4fe4-8829-ce6be7a64014 req-1a436137-80fa-45d4-888b-cc80344dd7c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received event network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:32 compute-1 nova_compute[189066]: 2025-12-05 09:34:32.164 189070 DEBUG oslo_concurrency.lockutils [req-be3f229e-061c-4fe4-8829-ce6be7a64014 req-1a436137-80fa-45d4-888b-cc80344dd7c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "438132a9-5f29-4b35-b457-98c182eb1660-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:32 compute-1 nova_compute[189066]: 2025-12-05 09:34:32.164 189070 DEBUG oslo_concurrency.lockutils [req-be3f229e-061c-4fe4-8829-ce6be7a64014 req-1a436137-80fa-45d4-888b-cc80344dd7c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:32 compute-1 nova_compute[189066]: 2025-12-05 09:34:32.164 189070 DEBUG oslo_concurrency.lockutils [req-be3f229e-061c-4fe4-8829-ce6be7a64014 req-1a436137-80fa-45d4-888b-cc80344dd7c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:32 compute-1 nova_compute[189066]: 2025-12-05 09:34:32.164 189070 DEBUG nova.compute.manager [req-be3f229e-061c-4fe4-8829-ce6be7a64014 req-1a436137-80fa-45d4-888b-cc80344dd7c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] No waiting events found dispatching network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:34:32 compute-1 nova_compute[189066]: 2025-12-05 09:34:32.164 189070 WARNING nova.compute.manager [req-be3f229e-061c-4fe4-8829-ce6be7a64014 req-1a436137-80fa-45d4-888b-cc80344dd7c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received unexpected event network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef for instance with vm_state active and task_state None.
Dec 05 09:34:34 compute-1 nova_compute[189066]: 2025-12-05 09:34:34.824 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:34 compute-1 nova_compute[189066]: 2025-12-05 09:34:34.983 189070 DEBUG nova.compute.manager [req-e6238f68-12a8-4795-b22f-10f903fd0806 req-965035a4-1ad0-4500-8f19-f06be3b0c545 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received event network-changed-20f9bafd-4df6-4d4a-849b-c578c6f0bcef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:34 compute-1 nova_compute[189066]: 2025-12-05 09:34:34.984 189070 DEBUG nova.compute.manager [req-e6238f68-12a8-4795-b22f-10f903fd0806 req-965035a4-1ad0-4500-8f19-f06be3b0c545 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Refreshing instance network info cache due to event network-changed-20f9bafd-4df6-4d4a-849b-c578c6f0bcef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:34:34 compute-1 nova_compute[189066]: 2025-12-05 09:34:34.984 189070 DEBUG oslo_concurrency.lockutils [req-e6238f68-12a8-4795-b22f-10f903fd0806 req-965035a4-1ad0-4500-8f19-f06be3b0c545 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:34:34 compute-1 nova_compute[189066]: 2025-12-05 09:34:34.984 189070 DEBUG oslo_concurrency.lockutils [req-e6238f68-12a8-4795-b22f-10f903fd0806 req-965035a4-1ad0-4500-8f19-f06be3b0c545 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:34:34 compute-1 nova_compute[189066]: 2025-12-05 09:34:34.985 189070 DEBUG nova.network.neutron [req-e6238f68-12a8-4795-b22f-10f903fd0806 req-965035a4-1ad0-4500-8f19-f06be3b0c545 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Refreshing network info cache for port 20f9bafd-4df6-4d4a-849b-c578c6f0bcef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:34:35 compute-1 ovn_controller[95809]: 2025-12-05T09:34:35Z|00180|binding|INFO|Releasing lport 48adc250-a1d1-4155-84e6-c2b0cd01f0be from this chassis (sb_readonly=0)
Dec 05 09:34:35 compute-1 nova_compute[189066]: 2025-12-05 09:34:35.885 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:36 compute-1 nova_compute[189066]: 2025-12-05 09:34:36.420 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:36 compute-1 nova_compute[189066]: 2025-12-05 09:34:36.759 189070 DEBUG nova.network.neutron [req-e6238f68-12a8-4795-b22f-10f903fd0806 req-965035a4-1ad0-4500-8f19-f06be3b0c545 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Updated VIF entry in instance network info cache for port 20f9bafd-4df6-4d4a-849b-c578c6f0bcef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:34:36 compute-1 nova_compute[189066]: 2025-12-05 09:34:36.760 189070 DEBUG nova.network.neutron [req-e6238f68-12a8-4795-b22f-10f903fd0806 req-965035a4-1ad0-4500-8f19-f06be3b0c545 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Updating instance_info_cache with network_info: [{"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:34:36 compute-1 nova_compute[189066]: 2025-12-05 09:34:36.786 189070 DEBUG oslo_concurrency.lockutils [req-e6238f68-12a8-4795-b22f-10f903fd0806 req-965035a4-1ad0-4500-8f19-f06be3b0c545 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-438132a9-5f29-4b35-b457-98c182eb1660" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:34:38 compute-1 podman[228098]: 2025-12-05 09:34:38.628773322 +0000 UTC m=+0.059607403 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:34:39 compute-1 nova_compute[189066]: 2025-12-05 09:34:39.826 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:41 compute-1 nova_compute[189066]: 2025-12-05 09:34:41.423 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:44 compute-1 ovn_controller[95809]: 2025-12-05T09:34:44Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e4:6f:05 10.100.0.9
Dec 05 09:34:44 compute-1 ovn_controller[95809]: 2025-12-05T09:34:44Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:6f:05 10.100.0.9
Dec 05 09:34:44 compute-1 nova_compute[189066]: 2025-12-05 09:34:44.829 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:45 compute-1 podman[228155]: 2025-12-05 09:34:45.656370306 +0000 UTC m=+0.095221443 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute)
Dec 05 09:34:46 compute-1 nova_compute[189066]: 2025-12-05 09:34:46.425 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.594 189070 INFO nova.compute.manager [None req-22e18c78-62b5-4568-9bc4-e1a0ab2efbae 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Get console output
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.603 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:34:49 compute-1 podman[228176]: 2025-12-05 09:34:49.706165615 +0000 UTC m=+0.138812025 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.832 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.916 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "438132a9-5f29-4b35-b457-98c182eb1660" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.917 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.917 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "438132a9-5f29-4b35-b457-98c182eb1660-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.918 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.918 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.919 189070 INFO nova.compute.manager [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Terminating instance
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.921 189070 DEBUG nova.compute.manager [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:34:49 compute-1 kernel: tap20f9bafd-4d (unregistering): left promiscuous mode
Dec 05 09:34:49 compute-1 NetworkManager[55704]: <info>  [1764927289.9516] device (tap20f9bafd-4d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:34:49 compute-1 ovn_controller[95809]: 2025-12-05T09:34:49Z|00181|binding|INFO|Releasing lport 20f9bafd-4df6-4d4a-849b-c578c6f0bcef from this chassis (sb_readonly=0)
Dec 05 09:34:49 compute-1 ovn_controller[95809]: 2025-12-05T09:34:49Z|00182|binding|INFO|Setting lport 20f9bafd-4df6-4d4a-849b-c578c6f0bcef down in Southbound
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.962 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:49 compute-1 ovn_controller[95809]: 2025-12-05T09:34:49Z|00183|binding|INFO|Removing iface tap20f9bafd-4d ovn-installed in OVS
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.966 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:49.978 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:6f:05 10.100.0.9'], port_security=['fa:16:3e:e4:6f:05 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '438132a9-5f29-4b35-b457-98c182eb1660', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-918acd23-3741-478c-80cb-c5530f6594f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '06a3815b-7930-400f-8e26-f165abb3d0b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.221'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f705fc6-bf74-428a-8b4c-cdd14a589f25, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=20f9bafd-4df6-4d4a-849b-c578c6f0bcef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:34:49 compute-1 nova_compute[189066]: 2025-12-05 09:34:49.981 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:49.982 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 20f9bafd-4df6-4d4a-849b-c578c6f0bcef in datapath 918acd23-3741-478c-80cb-c5530f6594f8 unbound from our chassis
Dec 05 09:34:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:49.984 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 918acd23-3741-478c-80cb-c5530f6594f8
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.006 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[01d9152c-0764-4659-9e67-621c0b732249]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:50 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000020.scope: Deactivated successfully.
Dec 05 09:34:50 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000020.scope: Consumed 14.726s CPU time.
Dec 05 09:34:50 compute-1 systemd-machined[154815]: Machine qemu-14-instance-00000020 terminated.
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.048 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[66d868ff-630a-4778-b244-dfc838f36e28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.053 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[b863be41-04d1-425e-b096-553a0b9b33e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.084 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[06d83316-fc3c-4a04-9619-1859141d75ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.106 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[21efbc33-c22d-4af3-ac68-53ea9b80fb88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap918acd23-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:ed:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449053, 'reachable_time': 41488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228214, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.128 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[73139d09-71a2-4ca5-856f-c3520981b890]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap918acd23-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449067, 'tstamp': 449067}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228215, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap918acd23-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449070, 'tstamp': 449070}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228215, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.131 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap918acd23-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.133 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.139 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap918acd23-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.139 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.140 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.140 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap918acd23-30, col_values=(('external_ids', {'iface-id': '48adc250-a1d1-4155-84e6-c2b0cd01f0be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:50 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:50.141 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.147 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.152 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.190 189070 INFO nova.virt.libvirt.driver [-] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Instance destroyed successfully.
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.190 189070 DEBUG nova.objects.instance [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'resources' on Instance uuid 438132a9-5f29-4b35-b457-98c182eb1660 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.207 189070 DEBUG nova.virt.libvirt.vif [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:34:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2088978795',display_name='tempest-TestNetworkBasicOps-server-2088978795',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2088978795',id=32,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMuwUlWk2J8MhLzGrVSCPKsjZ2cn2j5HsZgCuIma4m+NLQ3JmfTlLaKyz+KSaQTRKJUK07n4T6cKvrKFN6/xGM5MvB+0fJJJDxn949bj5AoXo6fvLq7r3jh6uryDDUZ5zQ==',key_name='tempest-TestNetworkBasicOps-829941913',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:34:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-jwd1ncb2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:34:30Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=438132a9-5f29-4b35-b457-98c182eb1660,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.208 189070 DEBUG nova.network.os_vif_util [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "address": "fa:16:3e:e4:6f:05", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20f9bafd-4d", "ovs_interfaceid": "20f9bafd-4df6-4d4a-849b-c578c6f0bcef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.209 189070 DEBUG nova.network.os_vif_util [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:6f:05,bridge_name='br-int',has_traffic_filtering=True,id=20f9bafd-4df6-4d4a-849b-c578c6f0bcef,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20f9bafd-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.209 189070 DEBUG os_vif [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:6f:05,bridge_name='br-int',has_traffic_filtering=True,id=20f9bafd-4df6-4d4a-849b-c578c6f0bcef,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20f9bafd-4d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.212 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.212 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20f9bafd-4d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.214 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.215 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.221 189070 INFO os_vif [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:6f:05,bridge_name='br-int',has_traffic_filtering=True,id=20f9bafd-4df6-4d4a-849b-c578c6f0bcef,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20f9bafd-4d')
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.221 189070 INFO nova.virt.libvirt.driver [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Deleting instance files /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660_del
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.222 189070 INFO nova.virt.libvirt.driver [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Deletion of /var/lib/nova/instances/438132a9-5f29-4b35-b457-98c182eb1660_del complete
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.279 189070 INFO nova.compute.manager [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.280 189070 DEBUG oslo.service.loopingcall [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.280 189070 DEBUG nova.compute.manager [-] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.280 189070 DEBUG nova.network.neutron [-] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.699 189070 DEBUG nova.compute.manager [req-84b93901-8343-423a-b6da-4549a128bef2 req-2f536bb9-5d4b-4d32-87c1-e62a606b2a0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received event network-vif-unplugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.700 189070 DEBUG oslo_concurrency.lockutils [req-84b93901-8343-423a-b6da-4549a128bef2 req-2f536bb9-5d4b-4d32-87c1-e62a606b2a0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "438132a9-5f29-4b35-b457-98c182eb1660-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.700 189070 DEBUG oslo_concurrency.lockutils [req-84b93901-8343-423a-b6da-4549a128bef2 req-2f536bb9-5d4b-4d32-87c1-e62a606b2a0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.701 189070 DEBUG oslo_concurrency.lockutils [req-84b93901-8343-423a-b6da-4549a128bef2 req-2f536bb9-5d4b-4d32-87c1-e62a606b2a0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.701 189070 DEBUG nova.compute.manager [req-84b93901-8343-423a-b6da-4549a128bef2 req-2f536bb9-5d4b-4d32-87c1-e62a606b2a0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] No waiting events found dispatching network-vif-unplugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:34:50 compute-1 nova_compute[189066]: 2025-12-05 09:34:50.701 189070 DEBUG nova.compute.manager [req-84b93901-8343-423a-b6da-4549a128bef2 req-2f536bb9-5d4b-4d32-87c1-e62a606b2a0a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received event network-vif-unplugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:34:52 compute-1 podman[228233]: 2025-12-05 09:34:52.633069496 +0000 UTC m=+0.066073542 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.731 189070 DEBUG nova.network.neutron [-] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.761 189070 INFO nova.compute.manager [-] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Took 2.48 seconds to deallocate network for instance.
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.837 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.837 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.869 189070 DEBUG nova.compute.manager [req-6558c71a-23a3-4b9d-a5e7-3056c6076006 req-929ff8d8-1dc1-47b5-bb95-a0c34b90718f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received event network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.870 189070 DEBUG oslo_concurrency.lockutils [req-6558c71a-23a3-4b9d-a5e7-3056c6076006 req-929ff8d8-1dc1-47b5-bb95-a0c34b90718f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "438132a9-5f29-4b35-b457-98c182eb1660-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.870 189070 DEBUG oslo_concurrency.lockutils [req-6558c71a-23a3-4b9d-a5e7-3056c6076006 req-929ff8d8-1dc1-47b5-bb95-a0c34b90718f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.870 189070 DEBUG oslo_concurrency.lockutils [req-6558c71a-23a3-4b9d-a5e7-3056c6076006 req-929ff8d8-1dc1-47b5-bb95-a0c34b90718f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.870 189070 DEBUG nova.compute.manager [req-6558c71a-23a3-4b9d-a5e7-3056c6076006 req-929ff8d8-1dc1-47b5-bb95-a0c34b90718f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] No waiting events found dispatching network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.871 189070 WARNING nova.compute.manager [req-6558c71a-23a3-4b9d-a5e7-3056c6076006 req-929ff8d8-1dc1-47b5-bb95-a0c34b90718f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received unexpected event network-vif-plugged-20f9bafd-4df6-4d4a-849b-c578c6f0bcef for instance with vm_state deleted and task_state None.
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.871 189070 DEBUG nova.compute.manager [req-6558c71a-23a3-4b9d-a5e7-3056c6076006 req-929ff8d8-1dc1-47b5-bb95-a0c34b90718f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Received event network-vif-deleted-20f9bafd-4df6-4d4a-849b-c578c6f0bcef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:52 compute-1 nova_compute[189066]: 2025-12-05 09:34:52.983 189070 DEBUG nova.compute.provider_tree [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:34:53 compute-1 nova_compute[189066]: 2025-12-05 09:34:53.004 189070 DEBUG nova.scheduler.client.report [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:34:53 compute-1 nova_compute[189066]: 2025-12-05 09:34:53.032 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:53 compute-1 nova_compute[189066]: 2025-12-05 09:34:53.067 189070 INFO nova.scheduler.client.report [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Deleted allocations for instance 438132a9-5f29-4b35-b457-98c182eb1660
Dec 05 09:34:53 compute-1 nova_compute[189066]: 2025-12-05 09:34:53.179 189070 DEBUG oslo_concurrency.lockutils [None req-f4b86f44-3290-4dea-95fa-423e6125ace0 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "438132a9-5f29-4b35-b457-98c182eb1660" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.531 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.532 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.532 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.533 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.533 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.534 189070 INFO nova.compute.manager [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Terminating instance
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.535 189070 DEBUG nova.compute.manager [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:34:54 compute-1 kernel: tapf02068c2-66 (unregistering): left promiscuous mode
Dec 05 09:34:54 compute-1 NetworkManager[55704]: <info>  [1764927294.5590] device (tapf02068c2-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:34:54 compute-1 ovn_controller[95809]: 2025-12-05T09:34:54Z|00184|binding|INFO|Releasing lport f02068c2-6668-4bd6-9364-5ddc884043c6 from this chassis (sb_readonly=0)
Dec 05 09:34:54 compute-1 ovn_controller[95809]: 2025-12-05T09:34:54Z|00185|binding|INFO|Setting lport f02068c2-6668-4bd6-9364-5ddc884043c6 down in Southbound
Dec 05 09:34:54 compute-1 ovn_controller[95809]: 2025-12-05T09:34:54Z|00186|binding|INFO|Removing iface tapf02068c2-66 ovn-installed in OVS
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.569 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.572 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.583 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:e6:29 10.100.0.10'], port_security=['fa:16:3e:e5:e6:29 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '06bf42f4-0e5e-43c8-81c4-b1df487dafe3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-918acd23-3741-478c-80cb-c5530f6594f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c71111d9-c1ab-4a44-83b2-10a513d3fb97', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f705fc6-bf74-428a-8b4c-cdd14a589f25, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=f02068c2-6668-4bd6-9364-5ddc884043c6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.584 105272 INFO neutron.agent.ovn.metadata.agent [-] Port f02068c2-6668-4bd6-9364-5ddc884043c6 in datapath 918acd23-3741-478c-80cb-c5530f6594f8 unbound from our chassis
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.587 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 918acd23-3741-478c-80cb-c5530f6594f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.587 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.588 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9e0eeb-74cd-4eb0-9ef1-5398f84d3098]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.589 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8 namespace which is not needed anymore
Dec 05 09:34:54 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Dec 05 09:34:54 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001c.scope: Consumed 16.643s CPU time.
Dec 05 09:34:54 compute-1 systemd-machined[154815]: Machine qemu-13-instance-0000001c terminated.
Dec 05 09:34:54 compute-1 podman[228252]: 2025-12-05 09:34:54.648790388 +0000 UTC m=+0.074147129 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 09:34:54 compute-1 neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8[227715]: [NOTICE]   (227719) : haproxy version is 2.8.14-c23fe91
Dec 05 09:34:54 compute-1 neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8[227715]: [NOTICE]   (227719) : path to executable is /usr/sbin/haproxy
Dec 05 09:34:54 compute-1 neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8[227715]: [WARNING]  (227719) : Exiting Master process...
Dec 05 09:34:54 compute-1 neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8[227715]: [WARNING]  (227719) : Exiting Master process...
Dec 05 09:34:54 compute-1 neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8[227715]: [ALERT]    (227719) : Current worker (227721) exited with code 143 (Terminated)
Dec 05 09:34:54 compute-1 neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8[227715]: [WARNING]  (227719) : All workers exited. Exiting... (0)
Dec 05 09:34:54 compute-1 systemd[1]: libpod-ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6.scope: Deactivated successfully.
Dec 05 09:34:54 compute-1 podman[228298]: 2025-12-05 09:34:54.748649103 +0000 UTC m=+0.049558439 container died ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.759 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.764 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6-userdata-shm.mount: Deactivated successfully.
Dec 05 09:34:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-e76c29f90911fd5c27181b6bb69505fa5d087040bd156891b7f30dbd3a661bb5-merged.mount: Deactivated successfully.
Dec 05 09:34:54 compute-1 podman[228298]: 2025-12-05 09:34:54.795786623 +0000 UTC m=+0.096695969 container cleanup ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.805 189070 INFO nova.virt.libvirt.driver [-] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Instance destroyed successfully.
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.806 189070 DEBUG nova.objects.instance [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'resources' on Instance uuid 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:34:54 compute-1 systemd[1]: libpod-conmon-ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6.scope: Deactivated successfully.
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.823 189070 DEBUG nova.virt.libvirt.vif [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-620680644',display_name='tempest-TestNetworkBasicOps-server-620680644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-620680644',id=28,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDm7JtMu9Dp/tAvdvilb6CWwpNoXl3CoNqyGQ9V53LjMNqLGWWqKoK0rxjz9+HMJlVNjmXBoxrWMPvPgS4KFYbqapau5Y14IAaLaUXSF6ZHboVdyTm2c2OurXRERdd7b/w==',key_name='tempest-TestNetworkBasicOps-344419197',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:33:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-54wr0cp7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:33:39Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=06bf42f4-0e5e-43c8-81c4-b1df487dafe3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.823 189070 DEBUG nova.network.os_vif_util [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "f02068c2-6668-4bd6-9364-5ddc884043c6", "address": "fa:16:3e:e5:e6:29", "network": {"id": "918acd23-3741-478c-80cb-c5530f6594f8", "bridge": "br-int", "label": "tempest-network-smoke--1861576532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf02068c2-66", "ovs_interfaceid": "f02068c2-6668-4bd6-9364-5ddc884043c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.824 189070 DEBUG nova.network.os_vif_util [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e5:e6:29,bridge_name='br-int',has_traffic_filtering=True,id=f02068c2-6668-4bd6-9364-5ddc884043c6,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf02068c2-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.825 189070 DEBUG os_vif [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:e6:29,bridge_name='br-int',has_traffic_filtering=True,id=f02068c2-6668-4bd6-9364-5ddc884043c6,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf02068c2-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.826 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.827 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf02068c2-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.828 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.830 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.831 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.833 189070 INFO os_vif [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:e6:29,bridge_name='br-int',has_traffic_filtering=True,id=f02068c2-6668-4bd6-9364-5ddc884043c6,network=Network(918acd23-3741-478c-80cb-c5530f6594f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf02068c2-66')
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.834 189070 INFO nova.virt.libvirt.driver [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Deleting instance files /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3_del
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.834 189070 INFO nova.virt.libvirt.driver [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Deletion of /var/lib/nova/instances/06bf42f4-0e5e-43c8-81c4-b1df487dafe3_del complete
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.839 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 podman[228341]: 2025-12-05 09:34:54.868157948 +0000 UTC m=+0.044786883 container remove ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.874 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3d2cdd03-8d1a-460f-a895-fdac731fbecd]: (4, ('Fri Dec  5 09:34:54 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8 (ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6)\nae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6\nFri Dec  5 09:34:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8 (ae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6)\nae4aebb1f529fcc5564fdc09f2d4f5f1e5ade28d28d2bfe3aae723729be47ce6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.876 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[66e4d42b-3e16-40bb-8d4d-da401b4fb0ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.877 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap918acd23-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.879 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 kernel: tap918acd23-30: left promiscuous mode
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.881 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.884 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a341a59b-6758-4bc9-a3ae-a1a7a222b3e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.893 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.908 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa3258f-9ec6-4546-8e21-71ee86d63602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.909 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[efb70965-722e-4547-af86-8e3ebeb70c99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.927 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[58d24b67-4f95-4033-aa2d-5a5a4397d1b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449045, 'reachable_time': 39271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228357, 'error': None, 'target': 'ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:54 compute-1 systemd[1]: run-netns-ovnmeta\x2d918acd23\x2d3741\x2d478c\x2d80cb\x2dc5530f6594f8.mount: Deactivated successfully.
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.932 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-918acd23-3741-478c-80cb-c5530f6594f8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:34:54 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:34:54.933 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[71a1023e-ea44-4cd1-bbc9-f6eb941aa121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.936 189070 INFO nova.compute.manager [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.936 189070 DEBUG oslo.service.loopingcall [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.937 189070 DEBUG nova.compute.manager [-] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:34:54 compute-1 nova_compute[189066]: 2025-12-05 09:34:54.937 189070 DEBUG nova.network.neutron [-] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.214 189070 DEBUG nova.compute.manager [req-4ba33f15-711e-4271-9e6a-bb8911021edf req-c9334503-9244-4043-9ead-04d4f8dab960 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-vif-unplugged-f02068c2-6668-4bd6-9364-5ddc884043c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.214 189070 DEBUG oslo_concurrency.lockutils [req-4ba33f15-711e-4271-9e6a-bb8911021edf req-c9334503-9244-4043-9ead-04d4f8dab960 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.214 189070 DEBUG oslo_concurrency.lockutils [req-4ba33f15-711e-4271-9e6a-bb8911021edf req-c9334503-9244-4043-9ead-04d4f8dab960 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.215 189070 DEBUG oslo_concurrency.lockutils [req-4ba33f15-711e-4271-9e6a-bb8911021edf req-c9334503-9244-4043-9ead-04d4f8dab960 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.215 189070 DEBUG nova.compute.manager [req-4ba33f15-711e-4271-9e6a-bb8911021edf req-c9334503-9244-4043-9ead-04d4f8dab960 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] No waiting events found dispatching network-vif-unplugged-f02068c2-6668-4bd6-9364-5ddc884043c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.215 189070 DEBUG nova.compute.manager [req-4ba33f15-711e-4271-9e6a-bb8911021edf req-c9334503-9244-4043-9ead-04d4f8dab960 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-vif-unplugged-f02068c2-6668-4bd6-9364-5ddc884043c6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.650 189070 DEBUG nova.network.neutron [-] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.688 189070 INFO nova.compute.manager [-] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Took 0.75 seconds to deallocate network for instance.
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.754 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.755 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:55 compute-1 nova_compute[189066]: 2025-12-05 09:34:55.830 189070 DEBUG nova.compute.provider_tree [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:34:56 compute-1 nova_compute[189066]: 2025-12-05 09:34:56.706 189070 DEBUG nova.compute.manager [req-c9404340-2e08-47d0-81f5-971e635d9857 req-d9cf4eac-6f9b-4eba-85be-bb19a62c1589 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-vif-deleted-f02068c2-6668-4bd6-9364-5ddc884043c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:56 compute-1 nova_compute[189066]: 2025-12-05 09:34:56.743 189070 DEBUG nova.scheduler.client.report [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:34:56 compute-1 nova_compute[189066]: 2025-12-05 09:34:56.773 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:56 compute-1 nova_compute[189066]: 2025-12-05 09:34:56.809 189070 INFO nova.scheduler.client.report [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Deleted allocations for instance 06bf42f4-0e5e-43c8-81c4-b1df487dafe3
Dec 05 09:34:56 compute-1 nova_compute[189066]: 2025-12-05 09:34:56.942 189070 DEBUG oslo_concurrency.lockutils [None req-1409d481-8163-406f-af58-f12849ad789d 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:58 compute-1 nova_compute[189066]: 2025-12-05 09:34:58.276 189070 DEBUG nova.compute.manager [req-8f0c8e32-df10-41b4-9e7c-24068628bf6b req-c034720c-1fbf-46ea-bbde-3bba3b41c752 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received event network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:34:58 compute-1 nova_compute[189066]: 2025-12-05 09:34:58.276 189070 DEBUG oslo_concurrency.lockutils [req-8f0c8e32-df10-41b4-9e7c-24068628bf6b req-c034720c-1fbf-46ea-bbde-3bba3b41c752 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:34:58 compute-1 nova_compute[189066]: 2025-12-05 09:34:58.277 189070 DEBUG oslo_concurrency.lockutils [req-8f0c8e32-df10-41b4-9e7c-24068628bf6b req-c034720c-1fbf-46ea-bbde-3bba3b41c752 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:34:58 compute-1 nova_compute[189066]: 2025-12-05 09:34:58.277 189070 DEBUG oslo_concurrency.lockutils [req-8f0c8e32-df10-41b4-9e7c-24068628bf6b req-c034720c-1fbf-46ea-bbde-3bba3b41c752 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "06bf42f4-0e5e-43c8-81c4-b1df487dafe3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:34:58 compute-1 nova_compute[189066]: 2025-12-05 09:34:58.277 189070 DEBUG nova.compute.manager [req-8f0c8e32-df10-41b4-9e7c-24068628bf6b req-c034720c-1fbf-46ea-bbde-3bba3b41c752 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] No waiting events found dispatching network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:34:58 compute-1 nova_compute[189066]: 2025-12-05 09:34:58.277 189070 WARNING nova.compute.manager [req-8f0c8e32-df10-41b4-9e7c-24068628bf6b req-c034720c-1fbf-46ea-bbde-3bba3b41c752 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Received unexpected event network-vif-plugged-f02068c2-6668-4bd6-9364-5ddc884043c6 for instance with vm_state deleted and task_state None.
Dec 05 09:34:58 compute-1 podman[228358]: 2025-12-05 09:34:58.644753696 +0000 UTC m=+0.066568605 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Dec 05 09:34:59 compute-1 nova_compute[189066]: 2025-12-05 09:34:59.829 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:34:59 compute-1 nova_compute[189066]: 2025-12-05 09:34:59.835 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:02 compute-1 podman[228379]: 2025-12-05 09:35:02.609432781 +0000 UTC m=+0.048992406 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:35:04 compute-1 nova_compute[189066]: 2025-12-05 09:35:04.833 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:04 compute-1 nova_compute[189066]: 2025-12-05 09:35:04.836 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:05 compute-1 nova_compute[189066]: 2025-12-05 09:35:05.187 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927290.1854088, 438132a9-5f29-4b35-b457-98c182eb1660 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:35:05 compute-1 nova_compute[189066]: 2025-12-05 09:35:05.188 189070 INFO nova.compute.manager [-] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] VM Stopped (Lifecycle Event)
Dec 05 09:35:05 compute-1 nova_compute[189066]: 2025-12-05 09:35:05.606 189070 DEBUG nova.compute.manager [None req-15b3f255-337b-45de-a486-a8f834123131 - - - - - -] [instance: 438132a9-5f29-4b35-b457-98c182eb1660] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:35:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:35:08.879 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:35:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:35:08.881 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:35:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:35:08.881 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:35:09 compute-1 podman[228406]: 2025-12-05 09:35:09.63288242 +0000 UTC m=+0.064573715 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:35:09 compute-1 nova_compute[189066]: 2025-12-05 09:35:09.803 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927294.8014235, 06bf42f4-0e5e-43c8-81c4-b1df487dafe3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:35:09 compute-1 nova_compute[189066]: 2025-12-05 09:35:09.803 189070 INFO nova.compute.manager [-] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] VM Stopped (Lifecycle Event)
Dec 05 09:35:09 compute-1 nova_compute[189066]: 2025-12-05 09:35:09.836 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:09 compute-1 nova_compute[189066]: 2025-12-05 09:35:09.837 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:10 compute-1 nova_compute[189066]: 2025-12-05 09:35:10.586 189070 DEBUG nova.compute.manager [None req-2b7bcd17-2013-41e5-b79d-0dbdb2e99324 - - - - - -] [instance: 06bf42f4-0e5e-43c8-81c4-b1df487dafe3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.750 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:35:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:35:12 compute-1 sshd-session[228404]: Received disconnect from 101.47.162.91 port 45694:11: Bye Bye [preauth]
Dec 05 09:35:12 compute-1 sshd-session[228404]: Disconnected from authenticating user root 101.47.162.91 port 45694 [preauth]
Dec 05 09:35:13 compute-1 nova_compute[189066]: 2025-12-05 09:35:13.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:13 compute-1 nova_compute[189066]: 2025-12-05 09:35:13.597 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:13 compute-1 nova_compute[189066]: 2025-12-05 09:35:13.768 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:14 compute-1 nova_compute[189066]: 2025-12-05 09:35:14.840 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:16 compute-1 podman[228431]: 2025-12-05 09:35:16.624556915 +0000 UTC m=+0.068372177 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:35:17 compute-1 nova_compute[189066]: 2025-12-05 09:35:17.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:17 compute-1 nova_compute[189066]: 2025-12-05 09:35:17.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:18 compute-1 nova_compute[189066]: 2025-12-05 09:35:18.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:18 compute-1 nova_compute[189066]: 2025-12-05 09:35:18.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.175 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.175 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.176 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.176 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.358 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.360 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5745MB free_disk=73.33369064331055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.361 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.361 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.843 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.875 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.876 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.901 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:35:19 compute-1 nova_compute[189066]: 2025-12-05 09:35:19.987 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:35:20 compute-1 nova_compute[189066]: 2025-12-05 09:35:20.048 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:35:20 compute-1 nova_compute[189066]: 2025-12-05 09:35:20.049 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:35:20 compute-1 podman[228453]: 2025-12-05 09:35:20.688810919 +0000 UTC m=+0.124830545 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:35:21 compute-1 nova_compute[189066]: 2025-12-05 09:35:21.049 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:21 compute-1 nova_compute[189066]: 2025-12-05 09:35:21.050 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:35:21 compute-1 nova_compute[189066]: 2025-12-05 09:35:21.050 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:35:21 compute-1 nova_compute[189066]: 2025-12-05 09:35:21.070 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:35:21 compute-1 nova_compute[189066]: 2025-12-05 09:35:21.071 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:21 compute-1 nova_compute[189066]: 2025-12-05 09:35:21.071 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:35:22.081 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:35:22 compute-1 nova_compute[189066]: 2025-12-05 09:35:22.081 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:35:22.082 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:35:22 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:35:22.083 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:35:23 compute-1 nova_compute[189066]: 2025-12-05 09:35:23.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:35:23 compute-1 podman[228479]: 2025-12-05 09:35:23.614811827 +0000 UTC m=+0.054600733 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 09:35:24 compute-1 nova_compute[189066]: 2025-12-05 09:35:24.858 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:25 compute-1 podman[228496]: 2025-12-05 09:35:25.618355541 +0000 UTC m=+0.058467377 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 05 09:35:29 compute-1 podman[228516]: 2025-12-05 09:35:29.617888615 +0000 UTC m=+0.061629244 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vendor=Red Hat, Inc.)
Dec 05 09:35:29 compute-1 nova_compute[189066]: 2025-12-05 09:35:29.859 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:33 compute-1 podman[228537]: 2025-12-05 09:35:33.615701017 +0000 UTC m=+0.053457284 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:35:34 compute-1 nova_compute[189066]: 2025-12-05 09:35:34.861 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:35:39 compute-1 nova_compute[189066]: 2025-12-05 09:35:39.863 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:35:39 compute-1 nova_compute[189066]: 2025-12-05 09:35:39.865 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:35:39 compute-1 nova_compute[189066]: 2025-12-05 09:35:39.865 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 09:35:39 compute-1 nova_compute[189066]: 2025-12-05 09:35:39.865 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:35:39 compute-1 nova_compute[189066]: 2025-12-05 09:35:39.900 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:39 compute-1 nova_compute[189066]: 2025-12-05 09:35:39.901 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:35:40 compute-1 podman[228563]: 2025-12-05 09:35:40.626131379 +0000 UTC m=+0.061504101 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:35:44 compute-1 nova_compute[189066]: 2025-12-05 09:35:44.901 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:44 compute-1 nova_compute[189066]: 2025-12-05 09:35:44.902 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:47 compute-1 podman[228587]: 2025-12-05 09:35:47.62092195 +0000 UTC m=+0.059865710 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Dec 05 09:35:49 compute-1 nova_compute[189066]: 2025-12-05 09:35:49.904 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:49 compute-1 nova_compute[189066]: 2025-12-05 09:35:49.907 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:35:51 compute-1 podman[228607]: 2025-12-05 09:35:51.73135678 +0000 UTC m=+0.166216514 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:35:54 compute-1 podman[228633]: 2025-12-05 09:35:54.640555609 +0000 UTC m=+0.071379052 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 09:35:54 compute-1 nova_compute[189066]: 2025-12-05 09:35:54.816 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:35:54 compute-1 nova_compute[189066]: 2025-12-05 09:35:54.816 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:35:54 compute-1 nova_compute[189066]: 2025-12-05 09:35:54.847 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:35:54 compute-1 nova_compute[189066]: 2025-12-05 09:35:54.907 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:35:54 compute-1 nova_compute[189066]: 2025-12-05 09:35:54.992 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:35:54 compute-1 nova_compute[189066]: 2025-12-05 09:35:54.993 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5087 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 09:35:54 compute-1 nova_compute[189066]: 2025-12-05 09:35:54.993 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:35:54 compute-1 nova_compute[189066]: 2025-12-05 09:35:54.994 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.003 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.033 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.034 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.041 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.041 189070 INFO nova.compute.claims [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.182 189070 DEBUG nova.compute.provider_tree [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.387 189070 DEBUG nova.scheduler.client.report [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.507 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.509 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.742 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.743 189070 DEBUG nova.network.neutron [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.781 189070 INFO nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.833 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:35:55 compute-1 nova_compute[189066]: 2025-12-05 09:35:55.999 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.001 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.001 189070 INFO nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Creating image(s)
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.002 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.003 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.004 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.020 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.056 189070 DEBUG nova.policy [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.088 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.089 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.091 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.109 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.173 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.175 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.269 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk 1073741824" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.270 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.271 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.331 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.332 189070 DEBUG nova.virt.disk.api [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Checking if we can resize image /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.333 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.394 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.395 189070 DEBUG nova.virt.disk.api [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Cannot resize image /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.396 189070 DEBUG nova.objects.instance [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'migration_context' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.411 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.412 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Ensure instance console log exists: /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.413 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.413 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:35:56 compute-1 nova_compute[189066]: 2025-12-05 09:35:56.414 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:35:56 compute-1 podman[228667]: 2025-12-05 09:35:56.618796346 +0000 UTC m=+0.057579415 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 05 09:35:59 compute-1 nova_compute[189066]: 2025-12-05 09:35:59.157 189070 DEBUG nova.network.neutron [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Successfully created port: dc081d90-d263-461d-a7af-bc8649b2d24e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:35:59 compute-1 nova_compute[189066]: 2025-12-05 09:35:59.994 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:00 compute-1 podman[228687]: 2025-12-05 09:36:00.629864473 +0000 UTC m=+0.061159053 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter)
Dec 05 09:36:04 compute-1 podman[228709]: 2025-12-05 09:36:04.616506363 +0000 UTC m=+0.055156836 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:36:04 compute-1 nova_compute[189066]: 2025-12-05 09:36:04.995 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:05 compute-1 nova_compute[189066]: 2025-12-05 09:36:05.975 189070 DEBUG nova.network.neutron [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Successfully updated port: dc081d90-d263-461d-a7af-bc8649b2d24e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:36:06 compute-1 nova_compute[189066]: 2025-12-05 09:36:06.000 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:36:06 compute-1 nova_compute[189066]: 2025-12-05 09:36:06.001 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:36:06 compute-1 nova_compute[189066]: 2025-12-05 09:36:06.001 189070 DEBUG nova.network.neutron [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:36:06 compute-1 nova_compute[189066]: 2025-12-05 09:36:06.225 189070 DEBUG nova.network.neutron [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:36:06 compute-1 nova_compute[189066]: 2025-12-05 09:36:06.656 189070 DEBUG nova.compute.manager [req-fbd2c8e7-2235-4871-826f-cfe70a3f421c req-82e12e0f-8155-4e58-858b-da8ca87b11e6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-changed-dc081d90-d263-461d-a7af-bc8649b2d24e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:36:06 compute-1 nova_compute[189066]: 2025-12-05 09:36:06.656 189070 DEBUG nova.compute.manager [req-fbd2c8e7-2235-4871-826f-cfe70a3f421c req-82e12e0f-8155-4e58-858b-da8ca87b11e6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing instance network info cache due to event network-changed-dc081d90-d263-461d-a7af-bc8649b2d24e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:36:06 compute-1 nova_compute[189066]: 2025-12-05 09:36:06.656 189070 DEBUG oslo_concurrency.lockutils [req-fbd2c8e7-2235-4871-826f-cfe70a3f421c req-82e12e0f-8155-4e58-858b-da8ca87b11e6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:36:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:08.881 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:08.882 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:08.882 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.363 189070 DEBUG nova.network.neutron [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.399 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.400 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Instance network_info: |[{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.400 189070 DEBUG oslo_concurrency.lockutils [req-fbd2c8e7-2235-4871-826f-cfe70a3f421c req-82e12e0f-8155-4e58-858b-da8ca87b11e6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.401 189070 DEBUG nova.network.neutron [req-fbd2c8e7-2235-4871-826f-cfe70a3f421c req-82e12e0f-8155-4e58-858b-da8ca87b11e6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing network info cache for port dc081d90-d263-461d-a7af-bc8649b2d24e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.403 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Start _get_guest_xml network_info=[{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.409 189070 WARNING nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.423 189070 DEBUG nova.virt.libvirt.host [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.424 189070 DEBUG nova.virt.libvirt.host [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.431 189070 DEBUG nova.virt.libvirt.host [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.432 189070 DEBUG nova.virt.libvirt.host [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.434 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.434 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.434 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.435 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.435 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.435 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.435 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.435 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.436 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.436 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.436 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.436 189070 DEBUG nova.virt.hardware [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.440 189070 DEBUG nova.virt.libvirt.vif [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:35:55Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.441 189070 DEBUG nova.network.os_vif_util [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.442 189070 DEBUG nova.network.os_vif_util [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:50:59,bridge_name='br-int',has_traffic_filtering=True,id=dc081d90-d263-461d-a7af-bc8649b2d24e,network=Network(9169e776-8a85-417a-9321-7e8c761484e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc081d90-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.442 189070 DEBUG nova.objects.instance [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.475 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <uuid>caf0a99c-b4d0-4fac-9883-ab0be359b528</uuid>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <name>instance-00000022</name>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkBasicOps-server-1152892970</nova:name>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:36:09</nova:creationTime>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:36:09 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:36:09 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:36:09 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:36:09 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:36:09 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:36:09 compute-1 nova_compute[189066]:         <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:36:09 compute-1 nova_compute[189066]:         <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:36:09 compute-1 nova_compute[189066]:         <nova:port uuid="dc081d90-d263-461d-a7af-bc8649b2d24e">
Dec 05 09:36:09 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <system>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <entry name="serial">caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <entry name="uuid">caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </system>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <os>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   </os>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <features>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   </features>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.config"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:5a:50:59"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <target dev="tapdc081d90-d2"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log" append="off"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <video>
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </video>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:36:09 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:36:09 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:36:09 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:36:09 compute-1 nova_compute[189066]: </domain>
Dec 05 09:36:09 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.476 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Preparing to wait for external event network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.476 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.476 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.477 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.477 189070 DEBUG nova.virt.libvirt.vif [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:35:55Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.478 189070 DEBUG nova.network.os_vif_util [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.478 189070 DEBUG nova.network.os_vif_util [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:50:59,bridge_name='br-int',has_traffic_filtering=True,id=dc081d90-d263-461d-a7af-bc8649b2d24e,network=Network(9169e776-8a85-417a-9321-7e8c761484e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc081d90-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.479 189070 DEBUG os_vif [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:50:59,bridge_name='br-int',has_traffic_filtering=True,id=dc081d90-d263-461d-a7af-bc8649b2d24e,network=Network(9169e776-8a85-417a-9321-7e8c761484e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc081d90-d2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.479 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.480 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.480 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.483 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.483 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc081d90-d2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.484 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdc081d90-d2, col_values=(('external_ids', {'iface-id': 'dc081d90-d263-461d-a7af-bc8649b2d24e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:50:59', 'vm-uuid': 'caf0a99c-b4d0-4fac-9883-ab0be359b528'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.485 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:09 compute-1 NetworkManager[55704]: <info>  [1764927369.4864] manager: (tapdc081d90-d2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.488 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.493 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.494 189070 INFO os_vif [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:50:59,bridge_name='br-int',has_traffic_filtering=True,id=dc081d90-d263-461d-a7af-bc8649b2d24e,network=Network(9169e776-8a85-417a-9321-7e8c761484e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc081d90-d2')
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.947 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.948 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.948 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:5a:50:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.948 189070 INFO nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Using config drive
Dec 05 09:36:09 compute-1 nova_compute[189066]: 2025-12-05 09:36:09.998 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:10 compute-1 nova_compute[189066]: 2025-12-05 09:36:10.839 189070 INFO nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Creating config drive at /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.config
Dec 05 09:36:10 compute-1 nova_compute[189066]: 2025-12-05 09:36:10.845 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9kn0z99t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:36:10 compute-1 nova_compute[189066]: 2025-12-05 09:36:10.976 189070 DEBUG oslo_concurrency.processutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9kn0z99t" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:36:11 compute-1 kernel: tapdc081d90-d2: entered promiscuous mode
Dec 05 09:36:11 compute-1 NetworkManager[55704]: <info>  [1764927371.0529] manager: (tapdc081d90-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Dec 05 09:36:11 compute-1 ovn_controller[95809]: 2025-12-05T09:36:11Z|00187|binding|INFO|Claiming lport dc081d90-d263-461d-a7af-bc8649b2d24e for this chassis.
Dec 05 09:36:11 compute-1 ovn_controller[95809]: 2025-12-05T09:36:11Z|00188|binding|INFO|dc081d90-d263-461d-a7af-bc8649b2d24e: Claiming fa:16:3e:5a:50:59 10.100.0.7
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.055 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.057 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.061 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.068 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:50:59 10.100.0.7'], port_security=['fa:16:3e:5a:50:59 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9169e776-8a85-417a-9321-7e8c761484e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '346fffc5-09ed-4aab-8fb9-5dfa08cec170', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e4f212a-f0a6-4181-b812-3bfaf26b63d6, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=dc081d90-d263-461d-a7af-bc8649b2d24e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.069 105272 INFO neutron.agent.ovn.metadata.agent [-] Port dc081d90-d263-461d-a7af-bc8649b2d24e in datapath 9169e776-8a85-417a-9321-7e8c761484e0 bound to our chassis
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.070 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9169e776-8a85-417a-9321-7e8c761484e0
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.086 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a61550-51bb-47e0-b655-c8af6db29e23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.087 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9169e776-81 in ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:36:11 compute-1 systemd-udevd[228762]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.090 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9169e776-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.090 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b03c1cfc-ebf9-4506-97a1-dae4a0870a2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.091 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b44c420c-5166-4f48-9c5f-86c5aa3554b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 NetworkManager[55704]: <info>  [1764927371.1033] device (tapdc081d90-d2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:36:11 compute-1 NetworkManager[55704]: <info>  [1764927371.1044] device (tapdc081d90-d2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.105 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[1068a59d-e463-4365-9f03-79548ad931bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 systemd-machined[154815]: New machine qemu-15-instance-00000022.
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.112 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 ovn_controller[95809]: 2025-12-05T09:36:11Z|00189|binding|INFO|Setting lport dc081d90-d263-461d-a7af-bc8649b2d24e ovn-installed in OVS
Dec 05 09:36:11 compute-1 ovn_controller[95809]: 2025-12-05T09:36:11Z|00190|binding|INFO|Setting lport dc081d90-d263-461d-a7af-bc8649b2d24e up in Southbound
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.118 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.127 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[07870b09-55f6-4a5b-bb47-07b90b92976f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-00000022.
Dec 05 09:36:11 compute-1 podman[228745]: 2025-12-05 09:36:11.144447051 +0000 UTC m=+0.094364632 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.163 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[9db7edea-4ae4-48e7-91ff-14b9d8ee0ffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.170 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4c587952-aadb-4af8-8870-47d1053ec285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 systemd-udevd[228769]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:36:11 compute-1 NetworkManager[55704]: <info>  [1764927371.1714] manager: (tap9169e776-80): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.208 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[fb36415c-0f53-4575-8311-2abbafcf222d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.212 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[69942836-0d9e-4421-899f-1722b61732f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 NetworkManager[55704]: <info>  [1764927371.2383] device (tap9169e776-80): carrier: link connected
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.245 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[7f9abf2e-74ff-4b87-bcf4-6c67fdc4962b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.265 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[624db809-be49-4046-8e8c-cd7d115c77ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9169e776-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:ff:97'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464424, 'reachable_time': 42945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228807, 'error': None, 'target': 'ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.288 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[44dbc0cf-39a8-4415-b6cf-3c92760dd0d5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:ff97'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464424, 'tstamp': 464424}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228808, 'error': None, 'target': 'ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.304 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b097ea75-a441-494f-acbe-5e1e0951e14e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9169e776-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:ff:97'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464424, 'reachable_time': 42945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228809, 'error': None, 'target': 'ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.342 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[458ed52b-5fc7-4b38-8d34-d6c79376fea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.418 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[55a78a3e-4eaf-45f4-b5a3-3af968a1441d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.420 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9169e776-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.420 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.420 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9169e776-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.422 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 kernel: tap9169e776-80: entered promiscuous mode
Dec 05 09:36:11 compute-1 NetworkManager[55704]: <info>  [1764927371.4236] manager: (tap9169e776-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.425 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.427 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9169e776-80, col_values=(('external_ids', {'iface-id': 'e2b66518-c69f-4ca7-89db-70c0fcadafde'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:11 compute-1 ovn_controller[95809]: 2025-12-05T09:36:11Z|00191|binding|INFO|Releasing lport e2b66518-c69f-4ca7-89db-70c0fcadafde from this chassis (sb_readonly=0)
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.429 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.430 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9169e776-8a85-417a-9321-7e8c761484e0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9169e776-8a85-417a-9321-7e8c761484e0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.431 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8a19edaa-5f27-4b86-807d-fb45521c0cbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.432 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-9169e776-8a85-417a-9321-7e8c761484e0
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/9169e776-8a85-417a-9321-7e8c761484e0.pid.haproxy
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 9169e776-8a85-417a-9321-7e8c761484e0
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:36:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:11.433 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0', 'env', 'PROCESS_TAG=haproxy-9169e776-8a85-417a-9321-7e8c761484e0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9169e776-8a85-417a-9321-7e8c761484e0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.440 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.464 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927371.4642525, caf0a99c-b4d0-4fac-9883-ab0be359b528 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.465 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] VM Started (Lifecycle Event)
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.493 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.498 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927371.4644847, caf0a99c-b4d0-4fac-9883-ab0be359b528 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.498 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] VM Paused (Lifecycle Event)
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.528 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.532 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.568 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:36:11 compute-1 podman[228846]: 2025-12-05 09:36:11.843063226 +0000 UTC m=+0.057352619 container create df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.867 189070 DEBUG nova.compute.manager [req-dbfd258a-eb38-4fa3-a2ef-3db6709e0f06 req-69e77efa-c1db-40b0-a79e-ddd04cced049 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.868 189070 DEBUG oslo_concurrency.lockutils [req-dbfd258a-eb38-4fa3-a2ef-3db6709e0f06 req-69e77efa-c1db-40b0-a79e-ddd04cced049 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.868 189070 DEBUG oslo_concurrency.lockutils [req-dbfd258a-eb38-4fa3-a2ef-3db6709e0f06 req-69e77efa-c1db-40b0-a79e-ddd04cced049 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.868 189070 DEBUG oslo_concurrency.lockutils [req-dbfd258a-eb38-4fa3-a2ef-3db6709e0f06 req-69e77efa-c1db-40b0-a79e-ddd04cced049 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.868 189070 DEBUG nova.compute.manager [req-dbfd258a-eb38-4fa3-a2ef-3db6709e0f06 req-69e77efa-c1db-40b0-a79e-ddd04cced049 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Processing event network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.870 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.874 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927371.873759, caf0a99c-b4d0-4fac-9883-ab0be359b528 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.875 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] VM Resumed (Lifecycle Event)
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.878 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.881 189070 INFO nova.virt.libvirt.driver [-] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Instance spawned successfully.
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.882 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:36:11 compute-1 systemd[1]: Started libpod-conmon-df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8.scope.
Dec 05 09:36:11 compute-1 podman[228846]: 2025-12-05 09:36:11.815589087 +0000 UTC m=+0.029878500 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.911 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.917 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.919 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.920 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.920 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.921 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.921 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.921 189070 DEBUG nova.virt.libvirt.driver [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:36:11 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:36:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25efba73102b76344ef0a313c000447af8c3a8e9e49e312c169b3e9959726e8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:36:11 compute-1 podman[228846]: 2025-12-05 09:36:11.946736664 +0000 UTC m=+0.161026077 container init df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:36:11 compute-1 podman[228846]: 2025-12-05 09:36:11.953310395 +0000 UTC m=+0.167599788 container start df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 09:36:11 compute-1 nova_compute[189066]: 2025-12-05 09:36:11.971 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:36:11 compute-1 neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0[228861]: [NOTICE]   (228865) : New worker (228867) forked
Dec 05 09:36:11 compute-1 neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0[228861]: [NOTICE]   (228865) : Loading success.
Dec 05 09:36:12 compute-1 nova_compute[189066]: 2025-12-05 09:36:12.010 189070 INFO nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Took 16.01 seconds to spawn the instance on the hypervisor.
Dec 05 09:36:12 compute-1 nova_compute[189066]: 2025-12-05 09:36:12.010 189070 DEBUG nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:36:12 compute-1 nova_compute[189066]: 2025-12-05 09:36:12.087 189070 INFO nova.compute.manager [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Took 17.09 seconds to build instance.
Dec 05 09:36:12 compute-1 nova_compute[189066]: 2025-12-05 09:36:12.108 189070 DEBUG oslo_concurrency.lockutils [None req-acd1d181-0a96-4bf5-bc4c-e12df5cce28e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:12 compute-1 nova_compute[189066]: 2025-12-05 09:36:12.699 189070 DEBUG nova.network.neutron [req-fbd2c8e7-2235-4871-826f-cfe70a3f421c req-82e12e0f-8155-4e58-858b-da8ca87b11e6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updated VIF entry in instance network info cache for port dc081d90-d263-461d-a7af-bc8649b2d24e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:36:12 compute-1 nova_compute[189066]: 2025-12-05 09:36:12.700 189070 DEBUG nova.network.neutron [req-fbd2c8e7-2235-4871-826f-cfe70a3f421c req-82e12e0f-8155-4e58-858b-da8ca87b11e6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:36:12 compute-1 nova_compute[189066]: 2025-12-05 09:36:12.757 189070 DEBUG oslo_concurrency.lockutils [req-fbd2c8e7-2235-4871-826f-cfe70a3f421c req-82e12e0f-8155-4e58-858b-da8ca87b11e6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:36:14 compute-1 nova_compute[189066]: 2025-12-05 09:36:14.019 189070 DEBUG nova.compute.manager [req-c707b719-2749-49e0-a4ee-6d10af08d205 req-2c32a0b8-4622-4de2-958c-35ba28d0f5dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:36:14 compute-1 nova_compute[189066]: 2025-12-05 09:36:14.019 189070 DEBUG oslo_concurrency.lockutils [req-c707b719-2749-49e0-a4ee-6d10af08d205 req-2c32a0b8-4622-4de2-958c-35ba28d0f5dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:14 compute-1 nova_compute[189066]: 2025-12-05 09:36:14.020 189070 DEBUG oslo_concurrency.lockutils [req-c707b719-2749-49e0-a4ee-6d10af08d205 req-2c32a0b8-4622-4de2-958c-35ba28d0f5dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:14 compute-1 nova_compute[189066]: 2025-12-05 09:36:14.020 189070 DEBUG oslo_concurrency.lockutils [req-c707b719-2749-49e0-a4ee-6d10af08d205 req-2c32a0b8-4622-4de2-958c-35ba28d0f5dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:14 compute-1 nova_compute[189066]: 2025-12-05 09:36:14.020 189070 DEBUG nova.compute.manager [req-c707b719-2749-49e0-a4ee-6d10af08d205 req-2c32a0b8-4622-4de2-958c-35ba28d0f5dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] No waiting events found dispatching network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:36:14 compute-1 nova_compute[189066]: 2025-12-05 09:36:14.020 189070 WARNING nova.compute.manager [req-c707b719-2749-49e0-a4ee-6d10af08d205 req-2c32a0b8-4622-4de2-958c-35ba28d0f5dc 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received unexpected event network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e for instance with vm_state active and task_state None.
Dec 05 09:36:14 compute-1 nova_compute[189066]: 2025-12-05 09:36:14.486 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:15 compute-1 nova_compute[189066]: 2025-12-05 09:36:15.000 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:15 compute-1 nova_compute[189066]: 2025-12-05 09:36:15.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:15 compute-1 nova_compute[189066]: 2025-12-05 09:36:15.151 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:17 compute-1 nova_compute[189066]: 2025-12-05 09:36:17.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:17 compute-1 nova_compute[189066]: 2025-12-05 09:36:17.274 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:17 compute-1 NetworkManager[55704]: <info>  [1764927377.2939] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Dec 05 09:36:17 compute-1 NetworkManager[55704]: <info>  [1764927377.2946] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Dec 05 09:36:17 compute-1 ovn_controller[95809]: 2025-12-05T09:36:17Z|00192|binding|INFO|Releasing lport e2b66518-c69f-4ca7-89db-70c0fcadafde from this chassis (sb_readonly=0)
Dec 05 09:36:17 compute-1 ovn_controller[95809]: 2025-12-05T09:36:17Z|00193|binding|INFO|Releasing lport e2b66518-c69f-4ca7-89db-70c0fcadafde from this chassis (sb_readonly=0)
Dec 05 09:36:17 compute-1 nova_compute[189066]: 2025-12-05 09:36:17.416 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:18 compute-1 nova_compute[189066]: 2025-12-05 09:36:18.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:18 compute-1 podman[228877]: 2025-12-05 09:36:18.640121167 +0000 UTC m=+0.067725684 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.054 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.055 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.055 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.055 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.138 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.203 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.204 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.262 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.454 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.456 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5576MB free_disk=73.32502365112305GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.457 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.457 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.489 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.559 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance caf0a99c-b4d0-4fac-9883-ab0be359b528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.560 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.560 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.619 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.640 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.677 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:36:19 compute-1 nova_compute[189066]: 2025-12-05 09:36:19.678 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:20 compute-1 nova_compute[189066]: 2025-12-05 09:36:20.031 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:20 compute-1 nova_compute[189066]: 2025-12-05 09:36:20.113 189070 DEBUG nova.compute.manager [req-6a76f8ff-1957-4b0d-a464-d772b4e52324 req-3b6c95c3-0c4e-4184-b83d-b0c47ac37d63 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-changed-dc081d90-d263-461d-a7af-bc8649b2d24e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:36:20 compute-1 nova_compute[189066]: 2025-12-05 09:36:20.114 189070 DEBUG nova.compute.manager [req-6a76f8ff-1957-4b0d-a464-d772b4e52324 req-3b6c95c3-0c4e-4184-b83d-b0c47ac37d63 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing instance network info cache due to event network-changed-dc081d90-d263-461d-a7af-bc8649b2d24e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:36:20 compute-1 nova_compute[189066]: 2025-12-05 09:36:20.114 189070 DEBUG oslo_concurrency.lockutils [req-6a76f8ff-1957-4b0d-a464-d772b4e52324 req-3b6c95c3-0c4e-4184-b83d-b0c47ac37d63 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:36:20 compute-1 nova_compute[189066]: 2025-12-05 09:36:20.114 189070 DEBUG oslo_concurrency.lockutils [req-6a76f8ff-1957-4b0d-a464-d772b4e52324 req-3b6c95c3-0c4e-4184-b83d-b0c47ac37d63 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:36:20 compute-1 nova_compute[189066]: 2025-12-05 09:36:20.115 189070 DEBUG nova.network.neutron [req-6a76f8ff-1957-4b0d-a464-d772b4e52324 req-3b6c95c3-0c4e-4184-b83d-b0c47ac37d63 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing network info cache for port dc081d90-d263-461d-a7af-bc8649b2d24e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:36:21 compute-1 nova_compute[189066]: 2025-12-05 09:36:21.680 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:21 compute-1 nova_compute[189066]: 2025-12-05 09:36:21.681 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:36:21 compute-1 nova_compute[189066]: 2025-12-05 09:36:21.681 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:36:21 compute-1 nova_compute[189066]: 2025-12-05 09:36:21.988 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:36:22 compute-1 nova_compute[189066]: 2025-12-05 09:36:22.198 189070 DEBUG nova.network.neutron [req-6a76f8ff-1957-4b0d-a464-d772b4e52324 req-3b6c95c3-0c4e-4184-b83d-b0c47ac37d63 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updated VIF entry in instance network info cache for port dc081d90-d263-461d-a7af-bc8649b2d24e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:36:22 compute-1 nova_compute[189066]: 2025-12-05 09:36:22.199 189070 DEBUG nova.network.neutron [req-6a76f8ff-1957-4b0d-a464-d772b4e52324 req-3b6c95c3-0c4e-4184-b83d-b0c47ac37d63 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:36:22 compute-1 nova_compute[189066]: 2025-12-05 09:36:22.231 189070 DEBUG oslo_concurrency.lockutils [req-6a76f8ff-1957-4b0d-a464-d772b4e52324 req-3b6c95c3-0c4e-4184-b83d-b0c47ac37d63 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:36:22 compute-1 nova_compute[189066]: 2025-12-05 09:36:22.232 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:36:22 compute-1 nova_compute[189066]: 2025-12-05 09:36:22.232 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:36:22 compute-1 nova_compute[189066]: 2025-12-05 09:36:22.233 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:36:22 compute-1 podman[228904]: 2025-12-05 09:36:22.666448194 +0000 UTC m=+0.103577697 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 09:36:24 compute-1 nova_compute[189066]: 2025-12-05 09:36:24.492 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:25 compute-1 nova_compute[189066]: 2025-12-05 09:36:25.034 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:25 compute-1 nova_compute[189066]: 2025-12-05 09:36:25.613 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:36:25 compute-1 podman[228956]: 2025-12-05 09:36:25.621002909 +0000 UTC m=+0.054667314 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 09:36:25 compute-1 nova_compute[189066]: 2025-12-05 09:36:25.642 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:36:25 compute-1 nova_compute[189066]: 2025-12-05 09:36:25.643 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:36:25 compute-1 nova_compute[189066]: 2025-12-05 09:36:25.643 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:25 compute-1 nova_compute[189066]: 2025-12-05 09:36:25.644 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:25 compute-1 nova_compute[189066]: 2025-12-05 09:36:25.644 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:36:26 compute-1 ovn_controller[95809]: 2025-12-05T09:36:26Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5a:50:59 10.100.0.7
Dec 05 09:36:26 compute-1 ovn_controller[95809]: 2025-12-05T09:36:26Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5a:50:59 10.100.0.7
Dec 05 09:36:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:26.613 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:36:26 compute-1 nova_compute[189066]: 2025-12-05 09:36:26.614 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:26.615 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:36:27 compute-1 podman[228975]: 2025-12-05 09:36:27.64656474 +0000 UTC m=+0.083350033 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 09:36:29 compute-1 nova_compute[189066]: 2025-12-05 09:36:29.495 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:29 compute-1 nova_compute[189066]: 2025-12-05 09:36:29.886 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:30 compute-1 nova_compute[189066]: 2025-12-05 09:36:30.036 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:31 compute-1 podman[228997]: 2025-12-05 09:36:31.636270665 +0000 UTC m=+0.070105140 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Dec 05 09:36:32 compute-1 nova_compute[189066]: 2025-12-05 09:36:32.903 189070 INFO nova.compute.manager [None req-de30127e-636c-4374-ae49-b6365d2fc87e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Get console output
Dec 05 09:36:32 compute-1 nova_compute[189066]: 2025-12-05 09:36:32.911 220618 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 05 09:36:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:33.618 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:34 compute-1 nova_compute[189066]: 2025-12-05 09:36:34.498 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:35 compute-1 nova_compute[189066]: 2025-12-05 09:36:35.038 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:35 compute-1 nova_compute[189066]: 2025-12-05 09:36:35.198 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:35 compute-1 podman[229019]: 2025-12-05 09:36:35.619725297 +0000 UTC m=+0.057300808 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:36:37 compute-1 nova_compute[189066]: 2025-12-05 09:36:37.673 189070 DEBUG oslo_concurrency.lockutils [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "interface-caf0a99c-b4d0-4fac-9883-ab0be359b528-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:37 compute-1 nova_compute[189066]: 2025-12-05 09:36:37.674 189070 DEBUG oslo_concurrency.lockutils [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "interface-caf0a99c-b4d0-4fac-9883-ab0be359b528-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:37 compute-1 nova_compute[189066]: 2025-12-05 09:36:37.675 189070 DEBUG nova.objects.instance [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'flavor' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:36:38 compute-1 nova_compute[189066]: 2025-12-05 09:36:38.086 189070 DEBUG nova.objects.instance [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_requests' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:36:38 compute-1 nova_compute[189066]: 2025-12-05 09:36:38.104 189070 DEBUG nova.network.neutron [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:36:38 compute-1 nova_compute[189066]: 2025-12-05 09:36:38.631 189070 DEBUG nova.policy [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:36:39 compute-1 nova_compute[189066]: 2025-12-05 09:36:39.501 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:39 compute-1 nova_compute[189066]: 2025-12-05 09:36:39.618 189070 DEBUG nova.network.neutron [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Successfully created port: 5f95901b-9655-49a7-a46d-bfe7f4c86a96 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:36:40 compute-1 nova_compute[189066]: 2025-12-05 09:36:40.095 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:40 compute-1 nova_compute[189066]: 2025-12-05 09:36:40.936 189070 DEBUG nova.network.neutron [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Successfully updated port: 5f95901b-9655-49a7-a46d-bfe7f4c86a96 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:36:40 compute-1 nova_compute[189066]: 2025-12-05 09:36:40.984 189070 DEBUG oslo_concurrency.lockutils [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:36:40 compute-1 nova_compute[189066]: 2025-12-05 09:36:40.985 189070 DEBUG oslo_concurrency.lockutils [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:36:40 compute-1 nova_compute[189066]: 2025-12-05 09:36:40.985 189070 DEBUG nova.network.neutron [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:36:41 compute-1 podman[229044]: 2025-12-05 09:36:41.634903251 +0000 UTC m=+0.068659635 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:36:41 compute-1 nova_compute[189066]: 2025-12-05 09:36:41.678 189070 DEBUG nova.compute.manager [req-90aa2cac-6f2b-4dfe-9e6d-7dbb1f0c538a req-54b98a39-14e9-4db9-977d-5fd8372dafe3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-changed-5f95901b-9655-49a7-a46d-bfe7f4c86a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:36:41 compute-1 nova_compute[189066]: 2025-12-05 09:36:41.678 189070 DEBUG nova.compute.manager [req-90aa2cac-6f2b-4dfe-9e6d-7dbb1f0c538a req-54b98a39-14e9-4db9-977d-5fd8372dafe3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing instance network info cache due to event network-changed-5f95901b-9655-49a7-a46d-bfe7f4c86a96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:36:41 compute-1 nova_compute[189066]: 2025-12-05 09:36:41.678 189070 DEBUG oslo_concurrency.lockutils [req-90aa2cac-6f2b-4dfe-9e6d-7dbb1f0c538a req-54b98a39-14e9-4db9-977d-5fd8372dafe3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:36:42 compute-1 nova_compute[189066]: 2025-12-05 09:36:42.064 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.273 189070 DEBUG nova.network.neutron [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.312 189070 DEBUG oslo_concurrency.lockutils [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.313 189070 DEBUG oslo_concurrency.lockutils [req-90aa2cac-6f2b-4dfe-9e6d-7dbb1f0c538a req-54b98a39-14e9-4db9-977d-5fd8372dafe3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.313 189070 DEBUG nova.network.neutron [req-90aa2cac-6f2b-4dfe-9e6d-7dbb1f0c538a req-54b98a39-14e9-4db9-977d-5fd8372dafe3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing network info cache for port 5f95901b-9655-49a7-a46d-bfe7f4c86a96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.317 189070 DEBUG nova.virt.libvirt.vif [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:36:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:36:12Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.318 189070 DEBUG nova.network.os_vif_util [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.319 189070 DEBUG nova.network.os_vif_util [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.319 189070 DEBUG os_vif [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.320 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.321 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.321 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.327 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.328 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f95901b-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.329 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5f95901b-96, col_values=(('external_ids', {'iface-id': '5f95901b-9655-49a7-a46d-bfe7f4c86a96', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:90:da', 'vm-uuid': 'caf0a99c-b4d0-4fac-9883-ab0be359b528'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:44 compute-1 NetworkManager[55704]: <info>  [1764927404.3323] manager: (tap5f95901b-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.331 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.335 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.343 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.345 189070 INFO os_vif [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96')
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.346 189070 DEBUG nova.virt.libvirt.vif [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:36:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:36:12Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.347 189070 DEBUG nova.network.os_vif_util [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.347 189070 DEBUG nova.network.os_vif_util [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.351 189070 DEBUG nova.virt.libvirt.guest [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] attach device xml: <interface type="ethernet">
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <mac address="fa:16:3e:b9:90:da"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <model type="virtio"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <mtu size="1442"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <target dev="tap5f95901b-96"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]: </interface>
Dec 05 09:36:44 compute-1 nova_compute[189066]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 05 09:36:44 compute-1 kernel: tap5f95901b-96: entered promiscuous mode
Dec 05 09:36:44 compute-1 NetworkManager[55704]: <info>  [1764927404.3649] manager: (tap5f95901b-96): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.364 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 ovn_controller[95809]: 2025-12-05T09:36:44Z|00194|binding|INFO|Claiming lport 5f95901b-9655-49a7-a46d-bfe7f4c86a96 for this chassis.
Dec 05 09:36:44 compute-1 ovn_controller[95809]: 2025-12-05T09:36:44Z|00195|binding|INFO|5f95901b-9655-49a7-a46d-bfe7f4c86a96: Claiming fa:16:3e:b9:90:da 10.100.0.20
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.369 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.404 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 ovn_controller[95809]: 2025-12-05T09:36:44Z|00196|binding|INFO|Setting lport 5f95901b-9655-49a7-a46d-bfe7f4c86a96 ovn-installed in OVS
Dec 05 09:36:44 compute-1 systemd-udevd[229075]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.406 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.423 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:90:da 10.100.0.20'], port_security=['fa:16:3e:b9:90:da 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28997311-91ec-41eb-8d69-3ce8625aacb4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b999a1ea-2a47-43bb-b6d8-1df5b14a091a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14d9d9ae-74ae-472f-b0db-60451ce9c2a3, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=5f95901b-9655-49a7-a46d-bfe7f4c86a96) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:36:44 compute-1 NetworkManager[55704]: <info>  [1764927404.4242] device (tap5f95901b-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.424 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 5f95901b-9655-49a7-a46d-bfe7f4c86a96 in datapath 28997311-91ec-41eb-8d69-3ce8625aacb4 bound to our chassis
Dec 05 09:36:44 compute-1 ovn_controller[95809]: 2025-12-05T09:36:44Z|00197|binding|INFO|Setting lport 5f95901b-9655-49a7-a46d-bfe7f4c86a96 up in Southbound
Dec 05 09:36:44 compute-1 NetworkManager[55704]: <info>  [1764927404.4258] device (tap5f95901b-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.426 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28997311-91ec-41eb-8d69-3ce8625aacb4
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.442 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7096b051-6589-4707-8f3f-0ca1a42e9961]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.445 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap28997311-91 in ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.448 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap28997311-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.448 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[023a1bdb-5d7f-4769-a8f7-6b17ba316030]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.450 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7bb5d5-9a5b-40c8-a0fe-20c1147249fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.466 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[7532090d-bb55-4121-adce-2e0a7b18fa30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.492 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[da00fb79-6469-4f0a-b3e9-89500b58b3aa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.495 189070 DEBUG nova.virt.libvirt.driver [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.496 189070 DEBUG nova.virt.libvirt.driver [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.496 189070 DEBUG nova.virt.libvirt.driver [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:5a:50:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.497 189070 DEBUG nova.virt.libvirt.driver [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:b9:90:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.527 189070 DEBUG nova.virt.libvirt.guest [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1152892970</nova:name>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:36:44</nova:creationTime>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:port uuid="dc081d90-d263-461d-a7af-bc8649b2d24e">
Dec 05 09:36:44 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     <nova:port uuid="5f95901b-9655-49a7-a46d-bfe7f4c86a96">
Dec 05 09:36:44 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Dec 05 09:36:44 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:36:44 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:36:44 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:36:44 compute-1 nova_compute[189066]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.539 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[37f98aeb-7117-40a3-9a50-6216274d73a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 NetworkManager[55704]: <info>  [1764927404.5470] manager: (tap28997311-90): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Dec 05 09:36:44 compute-1 systemd-udevd[229077]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.548 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[35462ce4-7d9d-4aca-a385-d825933dd915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.557 189070 DEBUG oslo_concurrency.lockutils [None req-67035611-ee0c-4865-a763-ce66511bce9f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "interface-caf0a99c-b4d0-4fac-9883-ab0be359b528-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.882s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.596 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb5de42-d701-4e04-b5a5-7d678abd37db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.602 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ed2955-5834-4c85-b3f9-dd4d0d1b0b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 NetworkManager[55704]: <info>  [1764927404.6342] device (tap28997311-90): carrier: link connected
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.642 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[998d5603-89ca-49ab-8e66-21d68a51dc8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.668 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fe09dc97-5e70-4a6a-8800-eb7703fdc34d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28997311-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:52:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467763, 'reachable_time': 37427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229102, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.690 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9673f62e-d40f-41ab-bda0-e3df0e3b0c40]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:522a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467763, 'tstamp': 467763}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229103, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.708 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8fde89bf-d588-477b-8c49-220f2d12b45f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28997311-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:52:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467763, 'reachable_time': 37427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229104, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.752 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[47d7eb56-b480-4487-a33f-d88fa465be1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.823 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[eb28c7ff-af0c-4562-bbca-c6507ab968b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.825 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28997311-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.825 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.826 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28997311-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.856 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 NetworkManager[55704]: <info>  [1764927404.8576] manager: (tap28997311-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Dec 05 09:36:44 compute-1 kernel: tap28997311-90: entered promiscuous mode
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.859 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.860 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28997311-90, col_values=(('external_ids', {'iface-id': '2420b7d7-ec11-4848-bfaf-bc0760f821fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.862 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 ovn_controller[95809]: 2025-12-05T09:36:44Z|00198|binding|INFO|Releasing lport 2420b7d7-ec11-4848-bfaf-bc0760f821fb from this chassis (sb_readonly=0)
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.863 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.864 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/28997311-91ec-41eb-8d69-3ce8625aacb4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/28997311-91ec-41eb-8d69-3ce8625aacb4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.865 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[15889234-f74a-48f0-80ab-589b40a3e33e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.866 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-28997311-91ec-41eb-8d69-3ce8625aacb4
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/28997311-91ec-41eb-8d69-3ce8625aacb4.pid.haproxy
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 28997311-91ec-41eb-8d69-3ce8625aacb4
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:36:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:36:44.867 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'env', 'PROCESS_TAG=haproxy-28997311-91ec-41eb-8d69-3ce8625aacb4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/28997311-91ec-41eb-8d69-3ce8625aacb4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:36:44 compute-1 nova_compute[189066]: 2025-12-05 09:36:44.874 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:45 compute-1 nova_compute[189066]: 2025-12-05 09:36:45.097 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:45 compute-1 podman[229136]: 2025-12-05 09:36:45.293638245 +0000 UTC m=+0.058077897 container create fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:36:45 compute-1 systemd[1]: Started libpod-conmon-fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636.scope.
Dec 05 09:36:45 compute-1 podman[229136]: 2025-12-05 09:36:45.264041653 +0000 UTC m=+0.028481325 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:36:45 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:36:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17994f304c950a8dae58807c31a8436b722c9a9e777f06786c14ab8b5c8649fe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:36:45 compute-1 podman[229136]: 2025-12-05 09:36:45.396400421 +0000 UTC m=+0.160840093 container init fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 09:36:45 compute-1 podman[229136]: 2025-12-05 09:36:45.404697722 +0000 UTC m=+0.169137364 container start fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:36:45 compute-1 neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4[229151]: [NOTICE]   (229155) : New worker (229157) forked
Dec 05 09:36:45 compute-1 neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4[229151]: [NOTICE]   (229155) : Loading success.
Dec 05 09:36:46 compute-1 ovn_controller[95809]: 2025-12-05T09:36:46Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b9:90:da 10.100.0.20
Dec 05 09:36:46 compute-1 ovn_controller[95809]: 2025-12-05T09:36:46Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:90:da 10.100.0.20
Dec 05 09:36:49 compute-1 nova_compute[189066]: 2025-12-05 09:36:49.332 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:49 compute-1 podman[229166]: 2025-12-05 09:36:49.655776002 +0000 UTC m=+0.086744766 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:36:50 compute-1 nova_compute[189066]: 2025-12-05 09:36:50.100 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:53 compute-1 nova_compute[189066]: 2025-12-05 09:36:53.557 189070 DEBUG nova.network.neutron [req-90aa2cac-6f2b-4dfe-9e6d-7dbb1f0c538a req-54b98a39-14e9-4db9-977d-5fd8372dafe3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updated VIF entry in instance network info cache for port 5f95901b-9655-49a7-a46d-bfe7f4c86a96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:36:53 compute-1 nova_compute[189066]: 2025-12-05 09:36:53.558 189070 DEBUG nova.network.neutron [req-90aa2cac-6f2b-4dfe-9e6d-7dbb1f0c538a req-54b98a39-14e9-4db9-977d-5fd8372dafe3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:36:53 compute-1 podman[229186]: 2025-12-05 09:36:53.678733308 +0000 UTC m=+0.111339306 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec 05 09:36:53 compute-1 nova_compute[189066]: 2025-12-05 09:36:53.890 189070 DEBUG oslo_concurrency.lockutils [req-90aa2cac-6f2b-4dfe-9e6d-7dbb1f0c538a req-54b98a39-14e9-4db9-977d-5fd8372dafe3 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:36:54 compute-1 nova_compute[189066]: 2025-12-05 09:36:54.335 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:55 compute-1 nova_compute[189066]: 2025-12-05 09:36:55.103 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:56 compute-1 podman[229212]: 2025-12-05 09:36:56.614418982 +0000 UTC m=+0.054595383 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:36:58 compute-1 podman[229231]: 2025-12-05 09:36:58.628463403 +0000 UTC m=+0.062068734 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.337 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.886 189070 DEBUG nova.compute.manager [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-vif-plugged-5f95901b-9655-49a7-a46d-bfe7f4c86a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.887 189070 DEBUG oslo_concurrency.lockutils [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.887 189070 DEBUG oslo_concurrency.lockutils [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.888 189070 DEBUG oslo_concurrency.lockutils [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.888 189070 DEBUG nova.compute.manager [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] No waiting events found dispatching network-vif-plugged-5f95901b-9655-49a7-a46d-bfe7f4c86a96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.889 189070 WARNING nova.compute.manager [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received unexpected event network-vif-plugged-5f95901b-9655-49a7-a46d-bfe7f4c86a96 for instance with vm_state active and task_state None.
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.889 189070 DEBUG nova.compute.manager [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-vif-plugged-5f95901b-9655-49a7-a46d-bfe7f4c86a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.890 189070 DEBUG oslo_concurrency.lockutils [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.890 189070 DEBUG oslo_concurrency.lockutils [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.890 189070 DEBUG oslo_concurrency.lockutils [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.891 189070 DEBUG nova.compute.manager [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] No waiting events found dispatching network-vif-plugged-5f95901b-9655-49a7-a46d-bfe7f4c86a96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:36:59 compute-1 nova_compute[189066]: 2025-12-05 09:36:59.891 189070 WARNING nova.compute.manager [req-7e7b28f8-deb4-4bcf-8439-3a4f97d092a4 req-3392fc91-0249-4eed-a4e6-fa698584912a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received unexpected event network-vif-plugged-5f95901b-9655-49a7-a46d-bfe7f4c86a96 for instance with vm_state active and task_state None.
Dec 05 09:37:00 compute-1 nova_compute[189066]: 2025-12-05 09:37:00.105 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:02 compute-1 podman[229252]: 2025-12-05 09:37:02.629477225 +0000 UTC m=+0.065267613 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350)
Dec 05 09:37:04 compute-1 nova_compute[189066]: 2025-12-05 09:37:04.339 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:05 compute-1 nova_compute[189066]: 2025-12-05 09:37:05.107 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:06 compute-1 podman[229274]: 2025-12-05 09:37:06.625454964 +0000 UTC m=+0.057791760 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:37:06 compute-1 nova_compute[189066]: 2025-12-05 09:37:06.865 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:06 compute-1 nova_compute[189066]: 2025-12-05 09:37:06.866 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:06 compute-1 nova_compute[189066]: 2025-12-05 09:37:06.923 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.060 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.061 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.074 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.074 189070 INFO nova.compute.claims [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.240 189070 DEBUG nova.compute.provider_tree [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.260 189070 DEBUG nova.scheduler.client.report [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.295 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.297 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.358 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.359 189070 DEBUG nova.network.neutron [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.385 189070 INFO nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.416 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.532 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.533 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.534 189070 INFO nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Creating image(s)
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.535 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "/var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.536 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.537 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.556 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.587 189070 DEBUG nova.policy [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.646 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.648 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.649 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.675 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.749 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.750 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.791 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.793 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.794 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.856 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.858 189070 DEBUG nova.virt.disk.api [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Checking if we can resize image /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.859 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.923 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.924 189070 DEBUG nova.virt.disk.api [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Cannot resize image /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.925 189070 DEBUG nova.objects.instance [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'migration_context' on Instance uuid e52dbfff-7d40-42b9-98c0-efbf2c7e1f73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.945 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.946 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Ensure instance console log exists: /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.946 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.947 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:07 compute-1 nova_compute[189066]: 2025-12-05 09:37:07.947 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:08.882 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:08.883 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:08.884 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:09 compute-1 nova_compute[189066]: 2025-12-05 09:37:09.069 189070 DEBUG nova.network.neutron [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Successfully created port: d1c8b84c-6329-4f97-a961-f92de838a0e8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:37:09 compute-1 nova_compute[189066]: 2025-12-05 09:37:09.342 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:10 compute-1 nova_compute[189066]: 2025-12-05 09:37:10.111 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:10 compute-1 nova_compute[189066]: 2025-12-05 09:37:10.598 189070 DEBUG nova.network.neutron [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Successfully updated port: d1c8b84c-6329-4f97-a961-f92de838a0e8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:37:10 compute-1 nova_compute[189066]: 2025-12-05 09:37:10.631 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:37:10 compute-1 nova_compute[189066]: 2025-12-05 09:37:10.632 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:37:10 compute-1 nova_compute[189066]: 2025-12-05 09:37:10.632 189070 DEBUG nova.network.neutron [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.753 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'name': 'tempest-TestNetworkBasicOps-server-1152892970', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000022', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'f918144b49634ed5a43d75f8f7d194d3', 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'hostId': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.771 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.773 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd916d05-5db4-4a05-8cb4-557d76236824', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.755817', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f9290d3a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.813444043, 'message_signature': '7e0a2ee5109c3d227e476535c98694ae01d7d2fc6f2fd267b4f42b0b3294caa0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.755817', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f929266c-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.813444043, 'message_signature': '248ede7e82717ced3a7d7ce50f03960aed71fbae50f8251f8b7a58f479579bc5'}]}, 'timestamp': '2025-12-05 09:37:10.773478', '_unique_id': '3aa05fd683904101a1113a01d69ade0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.778 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.780 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.803 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.write.bytes volume: 73089024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.804 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb4c012d-d0d6-47af-b197-20459eaa2800', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73089024, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.780974', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f92de5ee-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '7fc21cd03d59014309ea2cb07f9443dbf4634868d96c313fd17337b3c04f748a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.780974', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f92df7b4-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '13b43b5c0c1219243b6525c5a3b5acc2feed5b97042ebb3725456f996597b17a'}]}, 'timestamp': '2025-12-05 09:37:10.805017', '_unique_id': '2d327ce695a04fe48ba1be4ad2e9323e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.806 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.807 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.807 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1152892970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1152892970>]
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.811 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for caf0a99c-b4d0-4fac-9883-ab0be359b528 / tapdc081d90-d2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.812 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for caf0a99c-b4d0-4fac-9883-ab0be359b528 / tap5f95901b-96 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.812 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.bytes volume: 23940 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.812 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85f44d18-28dd-4069-abe3-3fa10e06c089', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23940, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.808030', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f92f204e-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '763117b7eb262c0669e7ef40576888b300e696842b350f03bda93e4452631141'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.808030', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f92f2b16-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '5116f18137f7975245b170d1176ab874d30ab9b05176a833903f38f5cd24125c'}]}, 'timestamp': '2025-12-05 09:37:10.812865', '_unique_id': 'b19f949c2f744a54b2479e555512beb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.814 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.815 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.write.latency volume: 2168815615 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.815 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c099dc92-3908-4bc8-8055-eee7deabeb10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2168815615, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.815008', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f92f8854-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': 'caa136981ddb386b2c175227c1345ee255b925484e0b9b656eb928acdc4f2af5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 
'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.815008', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f92f91aa-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': 'e9285626322c56fa0eb3d72fe8c53a0807bd5400557ef29592f9561ec138d536'}]}, 'timestamp': '2025-12-05 09:37:10.815473', '_unique_id': '4c44df2c562b41d4a237758e59b08513'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.816 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.817 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.packets volume: 150 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.817 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cff865a3-a54a-4a6e-a9a2-a61041995691', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 150, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.817059', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f92fd9e4-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '64bfc51f1bda38d1e55cb8487b445e3177808422edbe079ed08cae28cb4bdaa5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 
'822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.817059', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f92fe40c-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '2a336ed75cdeb28831b38319f8dff978568e7b6b8a68bd3706e95e97b3ff994f'}]}, 'timestamp': '2025-12-05 09:37:10.817625', '_unique_id': '5edbfec688c341b4b58b247e053d388a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.818 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.819 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.819 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.819 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f2d2581-b293-49e4-8cc5-39324b734897', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.819181', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f9302db8-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.813444043, 'message_signature': '265c4605009457e67ceb433ceb60103d1a563a8d45dbc79bf905e0a5c99b9c28'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.819181', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f93038ee-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.813444043, 'message_signature': 'ded9653e9fb85a970e44a23ee1ad5e15ab3e28b4cf1d533de4ddced86a149342'}]}, 'timestamp': '2025-12-05 09:37:10.819760', '_unique_id': 'eeb9a4c7df8845c5af4b8c37402532d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.820 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.821 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.821 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0f4e301-1659-45eb-a18a-735280a1ffa1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.821292', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f9307f98-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '747962a3d770d5a7607714beff197f774edd97c2fb5929214f34d35dfb707ce9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.821292', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f9308bb4-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '4358a26a75909ef866a547da1e1b061153070517410cf6802abf545459476d05'}]}, 'timestamp': '2025-12-05 09:37:10.821886', '_unique_id': 'db29a5ebde85448db1607da11b6f1af9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.823 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.823 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.read.bytes volume: 29575680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.823 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cabf0d4-c6c3-4e2c-8c5c-230cea4d2ce0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29575680, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.823421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f930d47a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '63d838e8d62aaf6aec9a8ce47c51851ffd99001021e230af1b72653613db442e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.823421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f930e14a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '375818f48429190a4d4d68b8ec8e578e06e7ae3c2fb7f8709f6fcfedafcd68cd'}]}, 'timestamp': '2025-12-05 09:37:10.824081', '_unique_id': '4e04a7da20b14bc49d6c7c8e5edab0e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.825 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.842 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/memory.usage volume: 47.84765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9246be36-f08d-4ca1-9a70-cea8429de4e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 47.84765625, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'timestamp': '2025-12-05T09:37:10.825731', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f933cfd6-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.900037425, 'message_signature': '82c0a360449584a51078064e8b33b6d4913886299f2ce22a4828e2d577782a50'}]}, 'timestamp': '2025-12-05 09:37:10.843453', '_unique_id': '0bed95e00e0743adb0fc80cbf8428529'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.844 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.845 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.bytes volume: 28195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.846 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.bytes volume: 1330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2af99766-6441-443f-a006-aaca71377c8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28195, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.845932', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f93442b8-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '4f8cbe85a1f713647f5969303b356635388bb50e4942c7a3cfd312fffea5cd5c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1330, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.845932', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f9344d4e-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '2d6b4e0df28d5e0cf79ff2fc6ab46ff674b190ceef364e5377389988524338e7'}]}, 'timestamp': '2025-12-05 09:37:10.846568', '_unique_id': '5745a988a85e47649e86b9bc32124435'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.848 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.848 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.848 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e86bd67-a1ce-4a8d-b910-38257333681b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.848396', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f934a2da-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '224e286a8b3a0040c4cf0fc1b049655994af5c71af2d65e8be6fa8fcc0e9a00b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.848396', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f934ad5c-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '6fb46630c0eb0afecaeee6cde88166f1017cacab63712d52c4525cb2aa7bc322'}]}, 'timestamp': '2025-12-05 09:37:10.848964', '_unique_id': 'fa5d71bcdcf94aec82a0059bdccd1320'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.849 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.850 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.850 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1152892970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1152892970>]
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.851 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/cpu volume: 13140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c341888c-85ea-4c62-be10-875875550a95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13140000000, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'timestamp': '2025-12-05T09:37:10.851094', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f9350e78-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.900037425, 'message_signature': '7a4a955f08256dfc540363ff5cd77eaf7386c463c9721b92ca16164b342d1d07'}]}, 'timestamp': '2025-12-05 09:37:10.851455', '_unique_id': '59eb4129e6dc4c74ba4454ab3fdf6f88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.852 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.853 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.853 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4d9684b-2fe9-4db4-bdbd-106773176229', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.853062', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f9355838-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.813444043, 'message_signature': 'c9c3b6c23780f25a3276b7940a16714ea9b45d1de9184ee8ac3a24508e4eb115'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 
'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.853062', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f93561fc-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.813444043, 'message_signature': '0dd8434cd6162999803ee976247da540a17bf25fcf1f108c6a1634f2d2f372a9'}]}, 'timestamp': '2025-12-05 09:37:10.853610', '_unique_id': '1f80b4ab6b464f3e9fbc31ea8bd2e4ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.854 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.855 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.855 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.read.requests volume: 1065 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.855 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f086599-2a8e-4cdb-a4b2-569780ef11ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1065, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.855144', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f935ab3a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '1788c2ed1c3f75aacab5be36fd244feb967c3005e095e97809e745bfb8ee04ca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': 
None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.855144', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f935b6de-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '58a67b825677d15820ac8210da8fe1ead2581da9cf39e6a89f156897c5fe3f24'}]}, 'timestamp': '2025-12-05 09:37:10.855756', '_unique_id': '03089609373247d4b7e403c768bc36a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.856 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.857 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.857 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5548936-91f9-4818-80f2-02afdcf23510', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.857396', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f936026a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': 'c7d5d3d781146e94ee0761d8dbaef580619bceea4ec32054f934ebbad0599d73'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.857396', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f9360cba-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': 'd60b1638bc87e7378296f451c177ac43ef6d4e4ec6c7b89e8a8a633802c5077e'}]}, 'timestamp': '2025-12-05 09:37:10.857957', '_unique_id': 'aca03e7e1845434ea7d9dc2c185d1fbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.859 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.859 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:37:10 compute-1 nova_compute[189066]: 2025-12-05 09:37:10.859 189070 DEBUG nova.network.neutron [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.859 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1152892970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1152892970>]
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.860 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.860 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.read.latency volume: 192136762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.860 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.read.latency volume: 27277555 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbd9815e-d3a3-4af6-b932-3fe1bfc24b62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 192136762, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.860186', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f9367010-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '80759bc4c3e9dd87f0aca89c2822b4a9d9f01b2ca96f63534c0f10a09568750b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27277555, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 
'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.860186', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f9367dbc-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '004380a52c5f6089021854be065014b3dea623b037ac6bba78558b4b4385a3c6'}]}, 'timestamp': '2025-12-05 09:37:10.860883', '_unique_id': 'e852c2d6d49d4addafa20514b986dbc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.862 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.862 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '364fd57f-967b-412f-ace0-ea573ada83b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.862516', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f936cace-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '150a81e704bcc744d5af63c421d8e3a4f9ec6feb6635820c7fa4df38172849c5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.862516', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f936d4ba-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '735e02da9929d7068f999e35bbea55e3fdb67d16a51c693019b1e26ec97117c9'}]}, 'timestamp': '2025-12-05 09:37:10.863078', '_unique_id': 'd06ca574566b4c66859c6ec678d85dd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.864 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.864 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.864 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1152892970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1152892970>]
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.865 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.865 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a2acab7-e8ad-4e5c-8cb0-54b09cbb1947', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.865032', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f9372b90-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': 'f01adb610991713ad5262bc7022bc7543ba174c41037c31a6ec43ca16e23538c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.865032', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f9373572-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': 'bbd8dc444f615591481bbec8d7e5aab51d87ad5fde4254f78500c2e6456f847a'}]}, 'timestamp': '2025-12-05 09:37:10.865569', '_unique_id': '29820bbbf03a43169a5fddd7c2b83eec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.867 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.packets volume: 151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.867 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69713c53-2a4d-4f80-82d8-176b2d78cd90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 151, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.867041', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f9377adc-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': 'e66fa70c89742b145c42d8fc33a6fbce6b3bb6febef03bf37fa868e8b3ed4d12'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.867041', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f93784e6-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '39e66cce09d67c1088cabf0ca1b740fd2e24edd9d1f27cbf3fa32cd469562ac8'}]}, 'timestamp': '2025-12-05 09:37:10.867610', '_unique_id': '23c4da9fc0d048598cfd6a084ea79b91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.868 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.869 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.write.requests volume: 323 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.869 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a393ea4-6650-4dff-997b-9d0a2ca14437', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 323, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-vda', 'timestamp': '2025-12-05T09:37:10.869104', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f937cac8-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '1379de02d8af1e71c85dfad17a2a92d4a32ae57fd7d5a3b56757ba1533ef48b7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': 
None, 'resource_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528-sda', 'timestamp': '2025-12-05T09:37:10.869104', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'instance-00000022', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f937d45a-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.838549346, 'message_signature': '3381c17298fa6c7f260ce62bc45a6ed25c5799ec52b4844db49836c4157e64c2'}]}, 'timestamp': '2025-12-05 09:37:10.869629', '_unique_id': '1b483c5cebfc492ab55fe5c294fdfc17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.870 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.871 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.871 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.871 12 DEBUG ceilometer.compute.pollsters [-] caf0a99c-b4d0-4fac-9883-ab0be359b528/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c40f32a3-8e19-4761-b9f1-99d715b6bdb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tapdc081d90-d2', 'timestamp': '2025-12-05T09:37:10.871308', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tapdc081d90-d2', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5a:50:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc081d90-d2'}, 'message_id': 'f9382158-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': 'ce678a231b148cefb1fbe8202294444eb0a0241ba642997d7e7e8c981900128d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_name': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_name': None, 'resource_id': 'instance-00000022-caf0a99c-b4d0-4fac-9883-ab0be359b528-tap5f95901b-96', 'timestamp': '2025-12-05T09:37:10.871308', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1152892970', 'name': 'tap5f95901b-96', 'instance_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'instance_type': 'm1.nano', 'host': 'de0b1f9031062db40c4b75256bb63dbec51acf6831140f3e7dbaa32f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:90:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f95901b-96'}, 'message_id': 'f9382d88-d1bd-11f0-a2f3-fa163e9454b0', 'monotonic_time': 4703.865595835, 'message_signature': '7a9573156aeaaaa1ba1630848e8e0ab0d0625f4f424abf01e22af806189d4e55'}]}, 'timestamp': '2025-12-05 09:37:10.871903', '_unique_id': '6e3ef5dddef7420ea153c2a012888b6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:37:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:37:10.872 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:37:11 compute-1 nova_compute[189066]: 2025-12-05 09:37:11.993 189070 DEBUG nova.compute.manager [req-319cd26c-4bc0-4f20-be7f-ccaab6a8eaa9 req-5d988715-3349-4877-b99e-23832633aff4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received event network-changed-d1c8b84c-6329-4f97-a961-f92de838a0e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:11 compute-1 nova_compute[189066]: 2025-12-05 09:37:11.994 189070 DEBUG nova.compute.manager [req-319cd26c-4bc0-4f20-be7f-ccaab6a8eaa9 req-5d988715-3349-4877-b99e-23832633aff4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Refreshing instance network info cache due to event network-changed-d1c8b84c-6329-4f97-a961-f92de838a0e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:37:11 compute-1 nova_compute[189066]: 2025-12-05 09:37:11.994 189070 DEBUG oslo_concurrency.lockutils [req-319cd26c-4bc0-4f20-be7f-ccaab6a8eaa9 req-5d988715-3349-4877-b99e-23832633aff4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.561 189070 DEBUG nova.network.neutron [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Updating instance_info_cache with network_info: [{"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.613 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.613 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Instance network_info: |[{"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.613 189070 DEBUG oslo_concurrency.lockutils [req-319cd26c-4bc0-4f20-be7f-ccaab6a8eaa9 req-5d988715-3349-4877-b99e-23832633aff4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.614 189070 DEBUG nova.network.neutron [req-319cd26c-4bc0-4f20-be7f-ccaab6a8eaa9 req-5d988715-3349-4877-b99e-23832633aff4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Refreshing network info cache for port d1c8b84c-6329-4f97-a961-f92de838a0e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.617 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Start _get_guest_xml network_info=[{"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.622 189070 WARNING nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.629 189070 DEBUG nova.virt.libvirt.host [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.630 189070 DEBUG nova.virt.libvirt.host [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:37:12 compute-1 podman[229313]: 2025-12-05 09:37:12.64687746 +0000 UTC m=+0.083482886 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.657 189070 DEBUG nova.virt.libvirt.host [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.658 189070 DEBUG nova.virt.libvirt.host [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.660 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.660 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.660 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.660 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.660 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.661 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.661 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.661 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.661 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.661 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.661 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.662 189070 DEBUG nova.virt.hardware [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.665 189070 DEBUG nova.virt.libvirt.vif [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:37:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1271291293',display_name='tempest-TestNetworkBasicOps-server-1271291293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1271291293',id=37,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCav6h9q6cQCtydy6fN3pqy63ChTHOaaGTonpqkFfpDVoHbnkUPZN1pcIIG4bFlZC/XU1xrWdoTW8YYzeVALsPEHsG1RbeGovqMrSlnxhaDobsygDgDTJKMvArd6D7J6Mw==',key_name='tempest-TestNetworkBasicOps-1935241973',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-cqmdqu5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:37:07Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=e52dbfff-7d40-42b9-98c0-efbf2c7e1f73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.666 189070 DEBUG nova.network.os_vif_util [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.666 189070 DEBUG nova.network.os_vif_util [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:cb:db,bridge_name='br-int',has_traffic_filtering=True,id=d1c8b84c-6329-4f97-a961-f92de838a0e8,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1c8b84c-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.667 189070 DEBUG nova.objects.instance [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid e52dbfff-7d40-42b9-98c0-efbf2c7e1f73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.737 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <uuid>e52dbfff-7d40-42b9-98c0-efbf2c7e1f73</uuid>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <name>instance-00000025</name>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkBasicOps-server-1271291293</nova:name>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:37:12</nova:creationTime>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:37:12 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:37:12 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:37:12 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:37:12 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:37:12 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:37:12 compute-1 nova_compute[189066]:         <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:37:12 compute-1 nova_compute[189066]:         <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:37:12 compute-1 nova_compute[189066]:         <nova:port uuid="d1c8b84c-6329-4f97-a961-f92de838a0e8">
Dec 05 09:37:12 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.30" ipVersion="4"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <system>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <entry name="serial">e52dbfff-7d40-42b9-98c0-efbf2c7e1f73</entry>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <entry name="uuid">e52dbfff-7d40-42b9-98c0-efbf2c7e1f73</entry>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </system>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <os>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   </os>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <features>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   </features>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk.config"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:60:cb:db"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <target dev="tapd1c8b84c-63"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/console.log" append="off"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <video>
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </video>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:37:12 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:37:12 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:37:12 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:37:12 compute-1 nova_compute[189066]: </domain>
Dec 05 09:37:12 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.737 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Preparing to wait for external event network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.738 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.738 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.738 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.739 189070 DEBUG nova.virt.libvirt.vif [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:37:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1271291293',display_name='tempest-TestNetworkBasicOps-server-1271291293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1271291293',id=37,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCav6h9q6cQCtydy6fN3pqy63ChTHOaaGTonpqkFfpDVoHbnkUPZN1pcIIG4bFlZC/XU1xrWdoTW8YYzeVALsPEHsG1RbeGovqMrSlnxhaDobsygDgDTJKMvArd6D7J6Mw==',key_name='tempest-TestNetworkBasicOps-1935241973',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-cqmdqu5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:37:07Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=e52dbfff-7d40-42b9-98c0-efbf2c7e1f73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.739 189070 DEBUG nova.network.os_vif_util [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.739 189070 DEBUG nova.network.os_vif_util [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:cb:db,bridge_name='br-int',has_traffic_filtering=True,id=d1c8b84c-6329-4f97-a961-f92de838a0e8,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1c8b84c-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.740 189070 DEBUG os_vif [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:cb:db,bridge_name='br-int',has_traffic_filtering=True,id=d1c8b84c-6329-4f97-a961-f92de838a0e8,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1c8b84c-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.740 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.740 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.741 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.744 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.745 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1c8b84c-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.746 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd1c8b84c-63, col_values=(('external_ids', {'iface-id': 'd1c8b84c-6329-4f97-a961-f92de838a0e8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:cb:db', 'vm-uuid': 'e52dbfff-7d40-42b9-98c0-efbf2c7e1f73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.747 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:12 compute-1 NetworkManager[55704]: <info>  [1764927432.7489] manager: (tapd1c8b84c-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.751 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.757 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.758 189070 INFO os_vif [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:cb:db,bridge_name='br-int',has_traffic_filtering=True,id=d1c8b84c-6329-4f97-a961-f92de838a0e8,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1c8b84c-63')
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.903 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.904 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.904 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:60:cb:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:37:12 compute-1 nova_compute[189066]: 2025-12-05 09:37:12.905 189070 INFO nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Using config drive
Dec 05 09:37:13 compute-1 nova_compute[189066]: 2025-12-05 09:37:13.741 189070 INFO nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Creating config drive at /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk.config
Dec 05 09:37:13 compute-1 nova_compute[189066]: 2025-12-05 09:37:13.746 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkqba_mrf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:13 compute-1 nova_compute[189066]: 2025-12-05 09:37:13.879 189070 DEBUG oslo_concurrency.processutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkqba_mrf" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:13 compute-1 kernel: tapd1c8b84c-63: entered promiscuous mode
Dec 05 09:37:13 compute-1 NetworkManager[55704]: <info>  [1764927433.9587] manager: (tapd1c8b84c-63): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Dec 05 09:37:13 compute-1 nova_compute[189066]: 2025-12-05 09:37:13.961 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:13 compute-1 ovn_controller[95809]: 2025-12-05T09:37:13Z|00199|binding|INFO|Claiming lport d1c8b84c-6329-4f97-a961-f92de838a0e8 for this chassis.
Dec 05 09:37:13 compute-1 ovn_controller[95809]: 2025-12-05T09:37:13Z|00200|binding|INFO|d1c8b84c-6329-4f97-a961-f92de838a0e8: Claiming fa:16:3e:60:cb:db 10.100.0.30
Dec 05 09:37:13 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:13.974 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:cb:db 10.100.0.30'], port_security=['fa:16:3e:60:cb:db 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': 'e52dbfff-7d40-42b9-98c0-efbf2c7e1f73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28997311-91ec-41eb-8d69-3ce8625aacb4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9272454f-abea-43fa-98ec-14d77b96a6ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14d9d9ae-74ae-472f-b0db-60451ce9c2a3, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=d1c8b84c-6329-4f97-a961-f92de838a0e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:37:13 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:13.976 105272 INFO neutron.agent.ovn.metadata.agent [-] Port d1c8b84c-6329-4f97-a961-f92de838a0e8 in datapath 28997311-91ec-41eb-8d69-3ce8625aacb4 bound to our chassis
Dec 05 09:37:13 compute-1 ovn_controller[95809]: 2025-12-05T09:37:13Z|00201|binding|INFO|Setting lport d1c8b84c-6329-4f97-a961-f92de838a0e8 ovn-installed in OVS
Dec 05 09:37:13 compute-1 ovn_controller[95809]: 2025-12-05T09:37:13Z|00202|binding|INFO|Setting lport d1c8b84c-6329-4f97-a961-f92de838a0e8 up in Southbound
Dec 05 09:37:13 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:13.978 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28997311-91ec-41eb-8d69-3ce8625aacb4
Dec 05 09:37:13 compute-1 nova_compute[189066]: 2025-12-05 09:37:13.978 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:13 compute-1 nova_compute[189066]: 2025-12-05 09:37:13.982 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:13 compute-1 systemd-udevd[229356]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:13.999 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9763a893-5d61-45e0-9944-4dcced489c74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:14 compute-1 NetworkManager[55704]: <info>  [1764927434.0083] device (tapd1c8b84c-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:37:14 compute-1 NetworkManager[55704]: <info>  [1764927434.0094] device (tapd1c8b84c-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:37:14 compute-1 systemd-machined[154815]: New machine qemu-16-instance-00000025.
Dec 05 09:37:14 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-00000025.
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.035 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[487f50ee-c05e-4c6a-94fc-98326f5d2f93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.039 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[6979a72b-52ba-499f-b441-57a05a3e55be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.073 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[2c311c5f-e61e-40f5-909a-a3950bdacdbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.095 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8205cacf-0e5a-4460-b82b-dcda0f6c8502]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28997311-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:52:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467763, 'reachable_time': 36877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229371, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.113 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e10d9b6c-9766-4c6a-a769-2138a7d0aa7c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap28997311-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467778, 'tstamp': 467778}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229373, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap28997311-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467781, 'tstamp': 467781}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229373, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.115 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28997311-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.117 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.119 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28997311-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.119 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.119 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28997311-90, col_values=(('external_ids', {'iface-id': '2420b7d7-ec11-4848-bfaf-bc0760f821fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:14 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:14.119 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.417 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927434.4165978, e52dbfff-7d40-42b9-98c0-efbf2c7e1f73 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.417 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] VM Started (Lifecycle Event)
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.448 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.456 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927434.4168181, e52dbfff-7d40-42b9-98c0-efbf2c7e1f73 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.457 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] VM Paused (Lifecycle Event)
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.508 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.511 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:37:14 compute-1 nova_compute[189066]: 2025-12-05 09:37:14.827 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.114 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.898 189070 DEBUG nova.network.neutron [req-319cd26c-4bc0-4f20-be7f-ccaab6a8eaa9 req-5d988715-3349-4877-b99e-23832633aff4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Updated VIF entry in instance network info cache for port d1c8b84c-6329-4f97-a961-f92de838a0e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.899 189070 DEBUG nova.network.neutron [req-319cd26c-4bc0-4f20-be7f-ccaab6a8eaa9 req-5d988715-3349-4877-b99e-23832633aff4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Updating instance_info_cache with network_info: [{"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.915 189070 DEBUG oslo_concurrency.lockutils [req-319cd26c-4bc0-4f20-be7f-ccaab6a8eaa9 req-5d988715-3349-4877-b99e-23832633aff4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.944 189070 DEBUG nova.compute.manager [req-af53a484-4fa3-4238-a8af-52482f1c8a92 req-12f54ffa-25ce-4416-bfe7-28f7988dd801 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received event network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.944 189070 DEBUG oslo_concurrency.lockutils [req-af53a484-4fa3-4238-a8af-52482f1c8a92 req-12f54ffa-25ce-4416-bfe7-28f7988dd801 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.945 189070 DEBUG oslo_concurrency.lockutils [req-af53a484-4fa3-4238-a8af-52482f1c8a92 req-12f54ffa-25ce-4416-bfe7-28f7988dd801 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.945 189070 DEBUG oslo_concurrency.lockutils [req-af53a484-4fa3-4238-a8af-52482f1c8a92 req-12f54ffa-25ce-4416-bfe7-28f7988dd801 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.945 189070 DEBUG nova.compute.manager [req-af53a484-4fa3-4238-a8af-52482f1c8a92 req-12f54ffa-25ce-4416-bfe7-28f7988dd801 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Processing event network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.946 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.949 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927435.9495883, e52dbfff-7d40-42b9-98c0-efbf2c7e1f73 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.949 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] VM Resumed (Lifecycle Event)
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.952 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.955 189070 INFO nova.virt.libvirt.driver [-] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Instance spawned successfully.
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.956 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.989 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.993 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.994 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.994 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.995 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.995 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:37:15 compute-1 nova_compute[189066]: 2025-12-05 09:37:15.996 189070 DEBUG nova.virt.libvirt.driver [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:37:16 compute-1 nova_compute[189066]: 2025-12-05 09:37:16.000 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:37:16 compute-1 nova_compute[189066]: 2025-12-05 09:37:16.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:16 compute-1 nova_compute[189066]: 2025-12-05 09:37:16.044 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:37:16 compute-1 nova_compute[189066]: 2025-12-05 09:37:16.082 189070 INFO nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Took 8.55 seconds to spawn the instance on the hypervisor.
Dec 05 09:37:16 compute-1 nova_compute[189066]: 2025-12-05 09:37:16.083 189070 DEBUG nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:37:16 compute-1 nova_compute[189066]: 2025-12-05 09:37:16.177 189070 INFO nova.compute.manager [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Took 9.17 seconds to build instance.
Dec 05 09:37:16 compute-1 nova_compute[189066]: 2025-12-05 09:37:16.249 189070 DEBUG oslo_concurrency.lockutils [None req-d2985bfd-2f0b-44b3-9098-4b8daa690b69 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:17 compute-1 nova_compute[189066]: 2025-12-05 09:37:17.748 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:18 compute-1 nova_compute[189066]: 2025-12-05 09:37:18.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:18 compute-1 nova_compute[189066]: 2025-12-05 09:37:18.458 189070 DEBUG nova.compute.manager [req-551f8314-4221-471f-ab5a-7750eddbfdb5 req-5765cb3e-17d2-4470-b7b0-fe92f69f85ef 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received event network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:18 compute-1 nova_compute[189066]: 2025-12-05 09:37:18.459 189070 DEBUG oslo_concurrency.lockutils [req-551f8314-4221-471f-ab5a-7750eddbfdb5 req-5765cb3e-17d2-4470-b7b0-fe92f69f85ef 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:18 compute-1 nova_compute[189066]: 2025-12-05 09:37:18.459 189070 DEBUG oslo_concurrency.lockutils [req-551f8314-4221-471f-ab5a-7750eddbfdb5 req-5765cb3e-17d2-4470-b7b0-fe92f69f85ef 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:18 compute-1 nova_compute[189066]: 2025-12-05 09:37:18.459 189070 DEBUG oslo_concurrency.lockutils [req-551f8314-4221-471f-ab5a-7750eddbfdb5 req-5765cb3e-17d2-4470-b7b0-fe92f69f85ef 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:18 compute-1 nova_compute[189066]: 2025-12-05 09:37:18.459 189070 DEBUG nova.compute.manager [req-551f8314-4221-471f-ab5a-7750eddbfdb5 req-5765cb3e-17d2-4470-b7b0-fe92f69f85ef 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] No waiting events found dispatching network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:37:18 compute-1 nova_compute[189066]: 2025-12-05 09:37:18.460 189070 WARNING nova.compute.manager [req-551f8314-4221-471f-ab5a-7750eddbfdb5 req-5765cb3e-17d2-4470-b7b0-fe92f69f85ef 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received unexpected event network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 for instance with vm_state active and task_state None.
Dec 05 09:37:19 compute-1 nova_compute[189066]: 2025-12-05 09:37:19.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:20 compute-1 nova_compute[189066]: 2025-12-05 09:37:20.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:20 compute-1 nova_compute[189066]: 2025-12-05 09:37:20.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 09:37:20 compute-1 nova_compute[189066]: 2025-12-05 09:37:20.051 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 09:37:20 compute-1 nova_compute[189066]: 2025-12-05 09:37:20.168 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:20 compute-1 podman[229381]: 2025-12-05 09:37:20.628373121 +0000 UTC m=+0.067387054 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.052 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.053 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.053 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.053 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.079 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.079 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.080 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.080 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.170 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.237 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.239 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.306 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.313 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.372 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.373 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.430 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.628 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.630 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5466MB free_disk=73.29557037353516GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.630 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.630 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.822 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance caf0a99c-b4d0-4fac-9883-ab0be359b528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.822 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance e52dbfff-7d40-42b9-98c0-efbf2c7e1f73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.823 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.823 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.907 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing inventories for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.931 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating ProviderTree inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.932 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.954 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing aggregate associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 09:37:21 compute-1 nova_compute[189066]: 2025-12-05 09:37:21.986 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing trait associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 09:37:22 compute-1 nova_compute[189066]: 2025-12-05 09:37:22.064 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:37:22 compute-1 nova_compute[189066]: 2025-12-05 09:37:22.086 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:37:22 compute-1 nova_compute[189066]: 2025-12-05 09:37:22.125 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:37:22 compute-1 nova_compute[189066]: 2025-12-05 09:37:22.125 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:22 compute-1 nova_compute[189066]: 2025-12-05 09:37:22.126 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:22 compute-1 nova_compute[189066]: 2025-12-05 09:37:22.127 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 09:37:22 compute-1 nova_compute[189066]: 2025-12-05 09:37:22.757 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:23 compute-1 nova_compute[189066]: 2025-12-05 09:37:23.193 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:23 compute-1 nova_compute[189066]: 2025-12-05 09:37:23.194 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:37:23 compute-1 nova_compute[189066]: 2025-12-05 09:37:23.194 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:37:23 compute-1 nova_compute[189066]: 2025-12-05 09:37:23.495 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:37:23 compute-1 nova_compute[189066]: 2025-12-05 09:37:23.495 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:37:23 compute-1 nova_compute[189066]: 2025-12-05 09:37:23.496 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:37:23 compute-1 nova_compute[189066]: 2025-12-05 09:37:23.496 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:37:24 compute-1 podman[229414]: 2025-12-05 09:37:24.686966095 +0000 UTC m=+0.124125907 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 05 09:37:25 compute-1 nova_compute[189066]: 2025-12-05 09:37:25.171 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:26.890 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:37:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:26.892 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:37:26 compute-1 nova_compute[189066]: 2025-12-05 09:37:26.892 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:27 compute-1 nova_compute[189066]: 2025-12-05 09:37:27.184 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:27 compute-1 nova_compute[189066]: 2025-12-05 09:37:27.210 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:37:27 compute-1 nova_compute[189066]: 2025-12-05 09:37:27.211 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:37:27 compute-1 nova_compute[189066]: 2025-12-05 09:37:27.212 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:27 compute-1 nova_compute[189066]: 2025-12-05 09:37:27.212 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:27 compute-1 podman[229440]: 2025-12-05 09:37:27.636508758 +0000 UTC m=+0.066942124 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 05 09:37:27 compute-1 nova_compute[189066]: 2025-12-05 09:37:27.759 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:28 compute-1 nova_compute[189066]: 2025-12-05 09:37:28.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:37:28 compute-1 ovn_controller[95809]: 2025-12-05T09:37:28Z|00203|binding|INFO|Releasing lport e2b66518-c69f-4ca7-89db-70c0fcadafde from this chassis (sb_readonly=0)
Dec 05 09:37:28 compute-1 ovn_controller[95809]: 2025-12-05T09:37:28Z|00204|binding|INFO|Releasing lport 2420b7d7-ec11-4848-bfaf-bc0760f821fb from this chassis (sb_readonly=0)
Dec 05 09:37:28 compute-1 nova_compute[189066]: 2025-12-05 09:37:28.500 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:28 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:28.895 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:29 compute-1 podman[229477]: 2025-12-05 09:37:29.618184019 +0000 UTC m=+0.053391343 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:37:29 compute-1 ovn_controller[95809]: 2025-12-05T09:37:29Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:cb:db 10.100.0.30
Dec 05 09:37:29 compute-1 ovn_controller[95809]: 2025-12-05T09:37:29Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:cb:db 10.100.0.30
Dec 05 09:37:30 compute-1 nova_compute[189066]: 2025-12-05 09:37:30.174 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:32 compute-1 nova_compute[189066]: 2025-12-05 09:37:32.762 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:33 compute-1 podman[229499]: 2025-12-05 09:37:33.65812367 +0000 UTC m=+0.100678406 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 09:37:35 compute-1 nova_compute[189066]: 2025-12-05 09:37:35.177 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:37 compute-1 nova_compute[189066]: 2025-12-05 09:37:37.324 189070 DEBUG nova.compute.manager [req-e5702b2e-daff-4b14-8745-77466b9bcfd5 req-cf6b0b16-4f88-4b44-a75e-a55a402ad604 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-changed-5f95901b-9655-49a7-a46d-bfe7f4c86a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:37 compute-1 nova_compute[189066]: 2025-12-05 09:37:37.324 189070 DEBUG nova.compute.manager [req-e5702b2e-daff-4b14-8745-77466b9bcfd5 req-cf6b0b16-4f88-4b44-a75e-a55a402ad604 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing instance network info cache due to event network-changed-5f95901b-9655-49a7-a46d-bfe7f4c86a96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:37:37 compute-1 nova_compute[189066]: 2025-12-05 09:37:37.325 189070 DEBUG oslo_concurrency.lockutils [req-e5702b2e-daff-4b14-8745-77466b9bcfd5 req-cf6b0b16-4f88-4b44-a75e-a55a402ad604 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:37:37 compute-1 nova_compute[189066]: 2025-12-05 09:37:37.325 189070 DEBUG oslo_concurrency.lockutils [req-e5702b2e-daff-4b14-8745-77466b9bcfd5 req-cf6b0b16-4f88-4b44-a75e-a55a402ad604 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:37:37 compute-1 nova_compute[189066]: 2025-12-05 09:37:37.325 189070 DEBUG nova.network.neutron [req-e5702b2e-daff-4b14-8745-77466b9bcfd5 req-cf6b0b16-4f88-4b44-a75e-a55a402ad604 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing network info cache for port 5f95901b-9655-49a7-a46d-bfe7f4c86a96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:37:37 compute-1 podman[229523]: 2025-12-05 09:37:37.636400196 +0000 UTC m=+0.065748414 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:37:37 compute-1 nova_compute[189066]: 2025-12-05 09:37:37.765 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:38 compute-1 nova_compute[189066]: 2025-12-05 09:37:38.947 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:39 compute-1 nova_compute[189066]: 2025-12-05 09:37:39.754 189070 DEBUG nova.network.neutron [req-e5702b2e-daff-4b14-8745-77466b9bcfd5 req-cf6b0b16-4f88-4b44-a75e-a55a402ad604 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updated VIF entry in instance network info cache for port 5f95901b-9655-49a7-a46d-bfe7f4c86a96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:37:39 compute-1 nova_compute[189066]: 2025-12-05 09:37:39.755 189070 DEBUG nova.network.neutron [req-e5702b2e-daff-4b14-8745-77466b9bcfd5 req-cf6b0b16-4f88-4b44-a75e-a55a402ad604 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:39 compute-1 nova_compute[189066]: 2025-12-05 09:37:39.883 189070 DEBUG oslo_concurrency.lockutils [req-e5702b2e-daff-4b14-8745-77466b9bcfd5 req-cf6b0b16-4f88-4b44-a75e-a55a402ad604 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:37:40 compute-1 nova_compute[189066]: 2025-12-05 09:37:40.181 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:42 compute-1 nova_compute[189066]: 2025-12-05 09:37:42.769 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:43 compute-1 podman[229547]: 2025-12-05 09:37:43.611904515 +0000 UTC m=+0.054233794 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:37:43 compute-1 nova_compute[189066]: 2025-12-05 09:37:43.991 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:43 compute-1 nova_compute[189066]: 2025-12-05 09:37:43.991 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:43 compute-1 nova_compute[189066]: 2025-12-05 09:37:43.992 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:43 compute-1 nova_compute[189066]: 2025-12-05 09:37:43.992 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:43 compute-1 nova_compute[189066]: 2025-12-05 09:37:43.992 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:43 compute-1 nova_compute[189066]: 2025-12-05 09:37:43.995 189070 INFO nova.compute.manager [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Terminating instance
Dec 05 09:37:43 compute-1 nova_compute[189066]: 2025-12-05 09:37:43.996 189070 DEBUG nova.compute.manager [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:37:44 compute-1 kernel: tapd1c8b84c-63 (unregistering): left promiscuous mode
Dec 05 09:37:44 compute-1 NetworkManager[55704]: <info>  [1764927464.0272] device (tapd1c8b84c-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.036 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:44 compute-1 ovn_controller[95809]: 2025-12-05T09:37:44Z|00205|binding|INFO|Releasing lport d1c8b84c-6329-4f97-a961-f92de838a0e8 from this chassis (sb_readonly=0)
Dec 05 09:37:44 compute-1 ovn_controller[95809]: 2025-12-05T09:37:44Z|00206|binding|INFO|Setting lport d1c8b84c-6329-4f97-a961-f92de838a0e8 down in Southbound
Dec 05 09:37:44 compute-1 ovn_controller[95809]: 2025-12-05T09:37:44Z|00207|binding|INFO|Removing iface tapd1c8b84c-63 ovn-installed in OVS
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.040 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.053 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:cb:db 10.100.0.30'], port_security=['fa:16:3e:60:cb:db 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': 'e52dbfff-7d40-42b9-98c0-efbf2c7e1f73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28997311-91ec-41eb-8d69-3ce8625aacb4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9272454f-abea-43fa-98ec-14d77b96a6ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14d9d9ae-74ae-472f-b0db-60451ce9c2a3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=d1c8b84c-6329-4f97-a961-f92de838a0e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.054 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.056 105272 INFO neutron.agent.ovn.metadata.agent [-] Port d1c8b84c-6329-4f97-a961-f92de838a0e8 in datapath 28997311-91ec-41eb-8d69-3ce8625aacb4 unbound from our chassis
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.058 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28997311-91ec-41eb-8d69-3ce8625aacb4
Dec 05 09:37:44 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000025.scope: Deactivated successfully.
Dec 05 09:37:44 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000025.scope: Consumed 14.805s CPU time.
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.081 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[86203e7f-f813-4145-85f2-882ceb274209]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:44 compute-1 systemd-machined[154815]: Machine qemu-16-instance-00000025 terminated.
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.163 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f1bcac-7dfa-417e-ad58-05342e48e27e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.168 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[d744506c-5540-4fdf-9c0f-c4f0d0983760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.200 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[511c2dbf-19ef-4474-9c2f-f1398afb1811]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.222 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.226 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[04b2e965-53fb-455e-aeba-20303de2d6fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28997311-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:52:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 7, 'rx_bytes': 790, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 7, 'rx_bytes': 790, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467763, 'reachable_time': 36877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 7, 'inoctets': 524, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 7, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 524, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 7, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229584, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.228 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.251 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4feea8-f107-4336-b61a-54f09d8f2d2c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap28997311-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467778, 'tstamp': 467778}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229594, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap28997311-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467781, 'tstamp': 467781}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229594, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.253 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28997311-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.255 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.262 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28997311-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.262 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.263 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.263 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28997311-90, col_values=(('external_ids', {'iface-id': '2420b7d7-ec11-4848-bfaf-bc0760f821fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:44.264 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.267 189070 INFO nova.virt.libvirt.driver [-] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Instance destroyed successfully.
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.268 189070 DEBUG nova.objects.instance [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'resources' on Instance uuid e52dbfff-7d40-42b9-98c0-efbf2c7e1f73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.292 189070 DEBUG nova.virt.libvirt.vif [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:37:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1271291293',display_name='tempest-TestNetworkBasicOps-server-1271291293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1271291293',id=37,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCav6h9q6cQCtydy6fN3pqy63ChTHOaaGTonpqkFfpDVoHbnkUPZN1pcIIG4bFlZC/XU1xrWdoTW8YYzeVALsPEHsG1RbeGovqMrSlnxhaDobsygDgDTJKMvArd6D7J6Mw==',key_name='tempest-TestNetworkBasicOps-1935241973',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:37:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-cqmdqu5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:37:16Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=e52dbfff-7d40-42b9-98c0-efbf2c7e1f73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.293 189070 DEBUG nova.network.os_vif_util [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "address": "fa:16:3e:60:cb:db", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1c8b84c-63", "ovs_interfaceid": "d1c8b84c-6329-4f97-a961-f92de838a0e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.294 189070 DEBUG nova.network.os_vif_util [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:cb:db,bridge_name='br-int',has_traffic_filtering=True,id=d1c8b84c-6329-4f97-a961-f92de838a0e8,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1c8b84c-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.295 189070 DEBUG os_vif [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:cb:db,bridge_name='br-int',has_traffic_filtering=True,id=d1c8b84c-6329-4f97-a961-f92de838a0e8,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1c8b84c-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.298 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.298 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1c8b84c-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.303 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.309 189070 INFO os_vif [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:cb:db,bridge_name='br-int',has_traffic_filtering=True,id=d1c8b84c-6329-4f97-a961-f92de838a0e8,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1c8b84c-63')
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.310 189070 INFO nova.virt.libvirt.driver [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Deleting instance files /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73_del
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.311 189070 INFO nova.virt.libvirt.driver [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Deletion of /var/lib/nova/instances/e52dbfff-7d40-42b9-98c0-efbf2c7e1f73_del complete
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.415 189070 INFO nova.compute.manager [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Took 0.42 seconds to destroy the instance on the hypervisor.
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.416 189070 DEBUG oslo.service.loopingcall [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.416 189070 DEBUG nova.compute.manager [-] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.416 189070 DEBUG nova.network.neutron [-] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.773 189070 DEBUG nova.compute.manager [req-ffc57d84-d187-42ff-9b68-f43a1f9695e9 req-1125b4ff-f096-4b58-9f74-b12ceb8c6937 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received event network-vif-unplugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.774 189070 DEBUG oslo_concurrency.lockutils [req-ffc57d84-d187-42ff-9b68-f43a1f9695e9 req-1125b4ff-f096-4b58-9f74-b12ceb8c6937 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.774 189070 DEBUG oslo_concurrency.lockutils [req-ffc57d84-d187-42ff-9b68-f43a1f9695e9 req-1125b4ff-f096-4b58-9f74-b12ceb8c6937 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.775 189070 DEBUG oslo_concurrency.lockutils [req-ffc57d84-d187-42ff-9b68-f43a1f9695e9 req-1125b4ff-f096-4b58-9f74-b12ceb8c6937 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.775 189070 DEBUG nova.compute.manager [req-ffc57d84-d187-42ff-9b68-f43a1f9695e9 req-1125b4ff-f096-4b58-9f74-b12ceb8c6937 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] No waiting events found dispatching network-vif-unplugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:37:44 compute-1 nova_compute[189066]: 2025-12-05 09:37:44.775 189070 DEBUG nova.compute.manager [req-ffc57d84-d187-42ff-9b68-f43a1f9695e9 req-1125b4ff-f096-4b58-9f74-b12ceb8c6937 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received event network-vif-unplugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:37:45 compute-1 nova_compute[189066]: 2025-12-05 09:37:45.185 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.009 189070 DEBUG nova.network.neutron [-] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.037 189070 INFO nova.compute.manager [-] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Took 1.62 seconds to deallocate network for instance.
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.096 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.097 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.212 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.215 189070 DEBUG nova.compute.provider_tree [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.234 189070 DEBUG nova.scheduler.client.report [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.261 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.306 189070 INFO nova.scheduler.client.report [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Deleted allocations for instance e52dbfff-7d40-42b9-98c0-efbf2c7e1f73
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.397 189070 DEBUG oslo_concurrency.lockutils [None req-b790e920-bc39-4cf6-909a-0743c87de692 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.971 189070 DEBUG nova.compute.manager [req-6d4e7129-56f9-4d8f-beb1-5c5e45f0f4a9 req-4e5b19b6-fc61-48b9-bd56-f0ac074e4590 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received event network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.972 189070 DEBUG oslo_concurrency.lockutils [req-6d4e7129-56f9-4d8f-beb1-5c5e45f0f4a9 req-4e5b19b6-fc61-48b9-bd56-f0ac074e4590 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.972 189070 DEBUG oslo_concurrency.lockutils [req-6d4e7129-56f9-4d8f-beb1-5c5e45f0f4a9 req-4e5b19b6-fc61-48b9-bd56-f0ac074e4590 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.972 189070 DEBUG oslo_concurrency.lockutils [req-6d4e7129-56f9-4d8f-beb1-5c5e45f0f4a9 req-4e5b19b6-fc61-48b9-bd56-f0ac074e4590 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "e52dbfff-7d40-42b9-98c0-efbf2c7e1f73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.972 189070 DEBUG nova.compute.manager [req-6d4e7129-56f9-4d8f-beb1-5c5e45f0f4a9 req-4e5b19b6-fc61-48b9-bd56-f0ac074e4590 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] No waiting events found dispatching network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.972 189070 WARNING nova.compute.manager [req-6d4e7129-56f9-4d8f-beb1-5c5e45f0f4a9 req-4e5b19b6-fc61-48b9-bd56-f0ac074e4590 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received unexpected event network-vif-plugged-d1c8b84c-6329-4f97-a961-f92de838a0e8 for instance with vm_state deleted and task_state None.
Dec 05 09:37:46 compute-1 nova_compute[189066]: 2025-12-05 09:37:46.973 189070 DEBUG nova.compute.manager [req-6d4e7129-56f9-4d8f-beb1-5c5e45f0f4a9 req-4e5b19b6-fc61-48b9-bd56-f0ac074e4590 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Received event network-vif-deleted-d1c8b84c-6329-4f97-a961-f92de838a0e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.410 189070 DEBUG oslo_concurrency.lockutils [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "interface-caf0a99c-b4d0-4fac-9883-ab0be359b528-5f95901b-9655-49a7-a46d-bfe7f4c86a96" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.411 189070 DEBUG oslo_concurrency.lockutils [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "interface-caf0a99c-b4d0-4fac-9883-ab0be359b528-5f95901b-9655-49a7-a46d-bfe7f4c86a96" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.430 189070 DEBUG nova.objects.instance [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'flavor' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.458 189070 DEBUG nova.virt.libvirt.vif [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:36:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:36:12Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.459 189070 DEBUG nova.network.os_vif_util [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.460 189070 DEBUG nova.network.os_vif_util [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.464 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.468 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.471 189070 DEBUG nova.virt.libvirt.driver [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Attempting to detach device tap5f95901b-96 from instance caf0a99c-b4d0-4fac-9883-ab0be359b528 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.472 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] detach device xml: <interface type="ethernet">
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <mac address="fa:16:3e:b9:90:da"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <model type="virtio"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <mtu size="1442"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <target dev="tap5f95901b-96"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]: </interface>
Dec 05 09:37:48 compute-1 nova_compute[189066]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.479 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.485 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface>not found in domain: <domain type='kvm' id='15'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <name>instance-00000022</name>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <uuid>caf0a99c-b4d0-4fac-9883-ab0be359b528</uuid>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1152892970</nova:name>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:36:44</nova:creationTime>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:port uuid="dc081d90-d263-461d-a7af-bc8649b2d24e">
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:port uuid="5f95901b-9655-49a7-a46d-bfe7f4c86a96">
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:37:48 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <memory unit='KiB'>131072</memory>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <vcpu placement='static'>1</vcpu>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <resource>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <partition>/machine</partition>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </resource>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <sysinfo type='smbios'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <system>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='manufacturer'>RDO</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='serial'>caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='uuid'>caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='family'>Virtual Machine</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </system>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <os>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <boot dev='hd'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <smbios mode='sysinfo'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </os>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <features>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <vmcoreinfo state='on'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </features>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <model fallback='forbid'>Nehalem</model>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <feature policy='require' name='x2apic'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <feature policy='require' name='hypervisor'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <feature policy='require' name='vme'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <clock offset='utc'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <timer name='hpet' present='no'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <on_poweroff>destroy</on_poweroff>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <on_reboot>restart</on_reboot>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <on_crash>destroy</on_crash>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <disk type='file' device='disk'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk' index='2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <backingStore type='file' index='3'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:         <format type='raw'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:         <source file='/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:         <backingStore/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       </backingStore>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target dev='vda' bus='virtio'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='virtio-disk0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <disk type='file' device='cdrom'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.config' index='1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <backingStore/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target dev='sda' bus='sata'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <readonly/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='sata0-0-0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pcie.0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='1' port='0x10'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='2' port='0x11'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='3' port='0x12'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.3'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='4' port='0x13'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.4'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='5' port='0x14'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.5'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='6' port='0x15'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.6'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='7' port='0x16'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.7'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='8' port='0x17'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.8'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='9' port='0x18'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.9'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='10' port='0x19'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.10'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='11' port='0x1a'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.11'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='12' port='0x1b'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.12'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='13' port='0x1c'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.13'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='14' port='0x1d'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.14'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='15' port='0x1e'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.15'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='16' port='0x1f'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.16'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='17' port='0x20'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.17'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='18' port='0x21'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.18'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='19' port='0x22'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.19'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='20' port='0x23'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.20'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='21' port='0x24'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.21'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='22' port='0x25'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.22'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='23' port='0x26'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.23'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='24' port='0x27'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.24'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='25' port='0x28'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.25'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-pci-bridge'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.26'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='usb'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='sata' index='0'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='ide'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:5a:50:59'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target dev='tapdc081d90-d2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='net0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:b9:90:da'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target dev='tap5f95901b-96'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='net1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <serial type='pty'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <source path='/dev/pts/0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log' append='off'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target type='isa-serial' port='0'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:         <model name='isa-serial'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       </target>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <console type='pty' tty='/dev/pts/0'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <source path='/dev/pts/0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log' append='off'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target type='serial' port='0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </console>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <input type='tablet' bus='usb'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='input0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='usb' bus='0' port='1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <input type='mouse' bus='ps2'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='input1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <input type='keyboard' bus='ps2'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='input2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <listen type='address' address='::0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <audio id='1' type='none'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <video>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='video0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </video>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <watchdog model='itco' action='reset'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='watchdog0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </watchdog>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <memballoon model='virtio'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <stats period='10'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='balloon0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <rng model='virtio'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <backend model='random'>/dev/urandom</backend>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='rng0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <label>system_u:system_r:svirt_t:s0:c66,c853</label>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c66,c853</imagelabel>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <label>+107:+107</label>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <imagelabel>+107:+107</imagelabel>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:37:48 compute-1 nova_compute[189066]: </domain>
Dec 05 09:37:48 compute-1 nova_compute[189066]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.487 189070 INFO nova.virt.libvirt.driver [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully detached device tap5f95901b-96 from instance caf0a99c-b4d0-4fac-9883-ab0be359b528 from the persistent domain config.
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.487 189070 DEBUG nova.virt.libvirt.driver [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] (1/8): Attempting to detach device tap5f95901b-96 with device alias net1 from instance caf0a99c-b4d0-4fac-9883-ab0be359b528 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.488 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] detach device xml: <interface type="ethernet">
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <mac address="fa:16:3e:b9:90:da"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <model type="virtio"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <mtu size="1442"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <target dev="tap5f95901b-96"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]: </interface>
Dec 05 09:37:48 compute-1 nova_compute[189066]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 05 09:37:48 compute-1 kernel: tap5f95901b-96 (unregistering): left promiscuous mode
Dec 05 09:37:48 compute-1 NetworkManager[55704]: <info>  [1764927468.6219] device (tap5f95901b-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.630 189070 DEBUG nova.virt.libvirt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Received event <DeviceRemovedEvent: 1764927468.6302931, caf0a99c-b4d0-4fac-9883-ab0be359b528 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 05 09:37:48 compute-1 ovn_controller[95809]: 2025-12-05T09:37:48Z|00208|binding|INFO|Releasing lport 5f95901b-9655-49a7-a46d-bfe7f4c86a96 from this chassis (sb_readonly=0)
Dec 05 09:37:48 compute-1 ovn_controller[95809]: 2025-12-05T09:37:48Z|00209|binding|INFO|Setting lport 5f95901b-9655-49a7-a46d-bfe7f4c86a96 down in Southbound
Dec 05 09:37:48 compute-1 ovn_controller[95809]: 2025-12-05T09:37:48Z|00210|binding|INFO|Removing iface tap5f95901b-96 ovn-installed in OVS
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.632 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.636 189070 DEBUG nova.virt.libvirt.driver [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Start waiting for the detach event from libvirt for device tap5f95901b-96 with device alias net1 for instance caf0a99c-b4d0-4fac-9883-ab0be359b528 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.637 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.639 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:90:da 10.100.0.20', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28997311-91ec-41eb-8d69-3ce8625aacb4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14d9d9ae-74ae-472f-b0db-60451ce9c2a3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=5f95901b-9655-49a7-a46d-bfe7f4c86a96) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.641 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 5f95901b-9655-49a7-a46d-bfe7f4c86a96 in datapath 28997311-91ec-41eb-8d69-3ce8625aacb4 unbound from our chassis
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.642 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface>not found in domain: <domain type='kvm' id='15'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <name>instance-00000022</name>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <uuid>caf0a99c-b4d0-4fac-9883-ab0be359b528</uuid>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1152892970</nova:name>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:36:44</nova:creationTime>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:port uuid="dc081d90-d263-461d-a7af-bc8649b2d24e">
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:port uuid="5f95901b-9655-49a7-a46d-bfe7f4c86a96">
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:37:48 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <memory unit='KiB'>131072</memory>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <vcpu placement='static'>1</vcpu>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <resource>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <partition>/machine</partition>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </resource>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <sysinfo type='smbios'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <system>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='manufacturer'>RDO</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='serial'>caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='uuid'>caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <entry name='family'>Virtual Machine</entry>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </system>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <os>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <boot dev='hd'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <smbios mode='sysinfo'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </os>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <features>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <vmcoreinfo state='on'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </features>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <model fallback='forbid'>Nehalem</model>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.643 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28997311-91ec-41eb-8d69-3ce8625aacb4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <feature policy='require' name='x2apic'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <feature policy='require' name='hypervisor'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <feature policy='require' name='vme'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <clock offset='utc'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <timer name='hpet' present='no'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <on_poweroff>destroy</on_poweroff>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <on_reboot>restart</on_reboot>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <on_crash>destroy</on_crash>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <disk type='file' device='disk'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk' index='2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <backingStore type='file' index='3'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:         <format type='raw'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:         <source file='/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:         <backingStore/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       </backingStore>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target dev='vda' bus='virtio'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='virtio-disk0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <disk type='file' device='cdrom'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.config' index='1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <backingStore/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target dev='sda' bus='sata'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <readonly/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='sata0-0-0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pcie.0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='1' port='0x10'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='2' port='0x11'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='3' port='0x12'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.3'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='4' port='0x13'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.4'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='5' port='0x14'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.5'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='6' port='0x15'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.6'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='7' port='0x16'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.7'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='8' port='0x17'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.8'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='9' port='0x18'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.9'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='10' port='0x19'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.10'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='11' port='0x1a'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.11'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='12' port='0x1b'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.12'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='13' port='0x1c'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.13'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='14' port='0x1d'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.14'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='15' port='0x1e'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.15'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='16' port='0x1f'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.16'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='17' port='0x20'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.17'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='18' port='0x21'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.18'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='19' port='0x22'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.19'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='20' port='0x23'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.20'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='21' port='0x24'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.21'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='22' port='0x25'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.22'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='23' port='0x26'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.23'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='24' port='0x27'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.24'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target chassis='25' port='0x28'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.25'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model name='pcie-pci-bridge'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='pci.26'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='usb'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <controller type='sata' index='0'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='ide'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:5a:50:59'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target dev='tapdc081d90-d2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='net0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <serial type='pty'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <source path='/dev/pts/0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log' append='off'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target type='isa-serial' port='0'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:         <model name='isa-serial'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       </target>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <console type='pty' tty='/dev/pts/0'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <source path='/dev/pts/0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log' append='off'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <target type='serial' port='0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </console>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <input type='tablet' bus='usb'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='input0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='usb' bus='0' port='1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <input type='mouse' bus='ps2'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='input1'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <input type='keyboard' bus='ps2'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='input2'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <listen type='address' address='::0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <audio id='1' type='none'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <video>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='video0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </video>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <watchdog model='itco' action='reset'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='watchdog0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </watchdog>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <memballoon model='virtio'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <stats period='10'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='balloon0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <rng model='virtio'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <backend model='random'>/dev/urandom</backend>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <alias name='rng0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <label>system_u:system_r:svirt_t:s0:c66,c853</label>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c66,c853</imagelabel>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <label>+107:+107</label>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <imagelabel>+107:+107</imagelabel>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:37:48 compute-1 nova_compute[189066]: </domain>
Dec 05 09:37:48 compute-1 nova_compute[189066]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.642 189070 INFO nova.virt.libvirt.driver [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully detached device tap5f95901b-96 from instance caf0a99c-b4d0-4fac-9883-ab0be359b528 from the live domain config.
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.643 189070 DEBUG nova.virt.libvirt.vif [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:36:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:36:12Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.645 189070 DEBUG nova.network.os_vif_util [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.646 189070 DEBUG nova.network.os_vif_util [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.646 189070 DEBUG os_vif [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.644 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7daad704-de33-4126-b967-a6ad99c5fc5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.647 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4 namespace which is not needed anymore
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.648 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.650 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f95901b-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.651 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.653 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.654 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.658 189070 INFO os_vif [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96')
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.660 189070 DEBUG nova.virt.libvirt.guest [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1152892970</nova:name>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:37:48</nova:creationTime>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     <nova:port uuid="dc081d90-d263-461d-a7af-bc8649b2d24e">
Dec 05 09:37:48 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:37:48 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:37:48 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:37:48 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:37:48 compute-1 nova_compute[189066]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 09:37:48 compute-1 neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4[229151]: [NOTICE]   (229155) : haproxy version is 2.8.14-c23fe91
Dec 05 09:37:48 compute-1 neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4[229151]: [NOTICE]   (229155) : path to executable is /usr/sbin/haproxy
Dec 05 09:37:48 compute-1 neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4[229151]: [WARNING]  (229155) : Exiting Master process...
Dec 05 09:37:48 compute-1 neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4[229151]: [WARNING]  (229155) : Exiting Master process...
Dec 05 09:37:48 compute-1 neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4[229151]: [ALERT]    (229155) : Current worker (229157) exited with code 143 (Terminated)
Dec 05 09:37:48 compute-1 neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4[229151]: [WARNING]  (229155) : All workers exited. Exiting... (0)
Dec 05 09:37:48 compute-1 systemd[1]: libpod-fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636.scope: Deactivated successfully.
Dec 05 09:37:48 compute-1 podman[229626]: 2025-12-05 09:37:48.806730557 +0000 UTC m=+0.047534171 container died fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 09:37:48 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636-userdata-shm.mount: Deactivated successfully.
Dec 05 09:37:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-17994f304c950a8dae58807c31a8436b722c9a9e777f06786c14ab8b5c8649fe-merged.mount: Deactivated successfully.
Dec 05 09:37:48 compute-1 podman[229626]: 2025-12-05 09:37:48.847661554 +0000 UTC m=+0.088465168 container cleanup fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 09:37:48 compute-1 systemd[1]: libpod-conmon-fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636.scope: Deactivated successfully.
Dec 05 09:37:48 compute-1 podman[229655]: 2025-12-05 09:37:48.916886013 +0000 UTC m=+0.045264505 container remove fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.923 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[14228fd3-96e7-495f-a65a-fa4c45b980cf]: (4, ('Fri Dec  5 09:37:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4 (fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636)\nfee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636\nFri Dec  5 09:37:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4 (fee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636)\nfee3f6323b68d63cd118a0cc257d4ff42fe472b9c153d6f214642805b5e98636\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.926 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f863f702-b79c-4336-a26f-938a14ce074f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.927 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28997311-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.929 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:48 compute-1 kernel: tap28997311-90: left promiscuous mode
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.940 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:48 compute-1 nova_compute[189066]: 2025-12-05 09:37:48.942 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.945 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e3c268-0846-4d75-b7e7-6ce6bac49ec8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.966 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[1e34858b-1857-40c4-96f7-a01e5f47dc34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.968 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f473faf4-745e-4e5a-9e7a-09f37b40a1a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:48 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:48.993 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b48334-f0ef-4b70-bc74-ff0cdfd451ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467753, 'reachable_time': 23196, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229670, 'error': None, 'target': 'ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:49 compute-1 systemd[1]: run-netns-ovnmeta\x2d28997311\x2d91ec\x2d41eb\x2d8d69\x2d3ce8625aacb4.mount: Deactivated successfully.
Dec 05 09:37:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:49.000 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-28997311-91ec-41eb-8d69-3ce8625aacb4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:37:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:49.001 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[07820c24-1d81-47d8-8354-d57604ac8122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:49 compute-1 nova_compute[189066]: 2025-12-05 09:37:49.263 189070 DEBUG oslo_concurrency.lockutils [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:37:49 compute-1 nova_compute[189066]: 2025-12-05 09:37:49.263 189070 DEBUG oslo_concurrency.lockutils [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:37:49 compute-1 nova_compute[189066]: 2025-12-05 09:37:49.263 189070 DEBUG nova.network.neutron [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:37:50 compute-1 sshd-session[229497]: ssh_dispatch_run_fatal: Connection from 101.47.162.91 port 35104: Connection timed out [preauth]
Dec 05 09:37:50 compute-1 nova_compute[189066]: 2025-12-05 09:37:50.187 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.215 189070 DEBUG nova.compute.manager [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-vif-deleted-5f95901b-9655-49a7-a46d-bfe7f4c86a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.215 189070 INFO nova.compute.manager [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Neutron deleted interface 5f95901b-9655-49a7-a46d-bfe7f4c86a96; detaching it from the instance and deleting it from the info cache
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.216 189070 DEBUG nova.network.neutron [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.245 189070 DEBUG nova.objects.instance [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lazy-loading 'system_metadata' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.276 189070 DEBUG nova.objects.instance [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lazy-loading 'flavor' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.301 189070 DEBUG nova.virt.libvirt.vif [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:36:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:36:12Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.302 189070 DEBUG nova.network.os_vif_util [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Converting VIF {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.302 189070 DEBUG nova.network.os_vif_util [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.306 189070 DEBUG nova.virt.libvirt.guest [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.310 189070 DEBUG nova.virt.libvirt.guest [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface>not found in domain: <domain type='kvm' id='15'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <name>instance-00000022</name>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <uuid>caf0a99c-b4d0-4fac-9883-ab0be359b528</uuid>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1152892970</nova:name>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:37:48</nova:creationTime>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:port uuid="dc081d90-d263-461d-a7af-bc8649b2d24e">
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:37:51 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <memory unit='KiB'>131072</memory>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <vcpu placement='static'>1</vcpu>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <resource>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <partition>/machine</partition>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </resource>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <sysinfo type='smbios'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <system>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='manufacturer'>RDO</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='serial'>caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='uuid'>caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='family'>Virtual Machine</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </system>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <os>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <boot dev='hd'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <smbios mode='sysinfo'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </os>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <features>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <vmcoreinfo state='on'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </features>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <model fallback='forbid'>Nehalem</model>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <feature policy='require' name='x2apic'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <feature policy='require' name='hypervisor'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <feature policy='require' name='vme'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <clock offset='utc'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <timer name='hpet' present='no'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <on_poweroff>destroy</on_poweroff>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <on_reboot>restart</on_reboot>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <on_crash>destroy</on_crash>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <disk type='file' device='disk'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk' index='2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <backingStore type='file' index='3'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:         <format type='raw'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:         <source file='/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:         <backingStore/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       </backingStore>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target dev='vda' bus='virtio'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='virtio-disk0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <disk type='file' device='cdrom'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.config' index='1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <backingStore/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target dev='sda' bus='sata'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <readonly/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='sata0-0-0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pcie.0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='1' port='0x10'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='2' port='0x11'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='3' port='0x12'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.3'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='4' port='0x13'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.4'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='5' port='0x14'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.5'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='6' port='0x15'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.6'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='7' port='0x16'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.7'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='8' port='0x17'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.8'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='9' port='0x18'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.9'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='10' port='0x19'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.10'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='11' port='0x1a'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.11'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='12' port='0x1b'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.12'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='13' port='0x1c'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.13'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='14' port='0x1d'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.14'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='15' port='0x1e'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.15'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='16' port='0x1f'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.16'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='17' port='0x20'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.17'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='18' port='0x21'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.18'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='19' port='0x22'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.19'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='20' port='0x23'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.20'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='21' port='0x24'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.21'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='22' port='0x25'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.22'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='23' port='0x26'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.23'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='24' port='0x27'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.24'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='25' port='0x28'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.25'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-pci-bridge'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.26'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='usb'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='sata' index='0'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='ide'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:5a:50:59'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target dev='tapdc081d90-d2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='net0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <serial type='pty'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <source path='/dev/pts/0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log' append='off'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target type='isa-serial' port='0'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:         <model name='isa-serial'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       </target>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <console type='pty' tty='/dev/pts/0'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <source path='/dev/pts/0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log' append='off'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target type='serial' port='0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </console>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <input type='tablet' bus='usb'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='input0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='usb' bus='0' port='1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <input type='mouse' bus='ps2'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='input1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <input type='keyboard' bus='ps2'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='input2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <listen type='address' address='::0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <audio id='1' type='none'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <video>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='video0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </video>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <watchdog model='itco' action='reset'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='watchdog0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </watchdog>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <memballoon model='virtio'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <stats period='10'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='balloon0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <rng model='virtio'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <backend model='random'>/dev/urandom</backend>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='rng0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <label>system_u:system_r:svirt_t:s0:c66,c853</label>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c66,c853</imagelabel>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <label>+107:+107</label>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <imagelabel>+107:+107</imagelabel>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:37:51 compute-1 nova_compute[189066]: </domain>
Dec 05 09:37:51 compute-1 nova_compute[189066]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.313 189070 DEBUG nova.virt.libvirt.guest [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.318 189070 DEBUG nova.virt.libvirt.guest [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b9:90:da"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5f95901b-96"/></interface>not found in domain: <domain type='kvm' id='15'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <name>instance-00000022</name>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <uuid>caf0a99c-b4d0-4fac-9883-ab0be359b528</uuid>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1152892970</nova:name>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:37:48</nova:creationTime>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:port uuid="dc081d90-d263-461d-a7af-bc8649b2d24e">
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:37:51 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <memory unit='KiB'>131072</memory>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <vcpu placement='static'>1</vcpu>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <resource>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <partition>/machine</partition>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </resource>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <sysinfo type='smbios'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <system>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='manufacturer'>RDO</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='product'>OpenStack Compute</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='serial'>caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='uuid'>caf0a99c-b4d0-4fac-9883-ab0be359b528</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <entry name='family'>Virtual Machine</entry>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </system>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <os>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <boot dev='hd'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <smbios mode='sysinfo'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </os>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <features>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <vmcoreinfo state='on'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </features>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <cpu mode='custom' match='exact' check='full'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <model fallback='forbid'>Nehalem</model>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <feature policy='require' name='x2apic'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <feature policy='require' name='hypervisor'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <feature policy='require' name='vme'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <clock offset='utc'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <timer name='pit' tickpolicy='delay'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <timer name='hpet' present='no'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <on_poweroff>destroy</on_poweroff>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <on_reboot>restart</on_reboot>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <on_crash>destroy</on_crash>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <disk type='file' device='disk'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk' index='2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <backingStore type='file' index='3'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:         <format type='raw'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:         <source file='/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:         <backingStore/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       </backingStore>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target dev='vda' bus='virtio'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='virtio-disk0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <disk type='file' device='cdrom'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <driver name='qemu' type='raw' cache='none'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <source file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/disk.config' index='1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <backingStore/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target dev='sda' bus='sata'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <readonly/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='sata0-0-0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='0' model='pcie-root'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pcie.0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='1' port='0x10'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='2' port='0x11'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='3' port='0x12'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.3'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='4' port='0x13'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.4'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='5' port='0x14'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.5'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='6' port='0x15'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.6'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='7' port='0x16'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.7'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='8' port='0x17'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.8'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='9' port='0x18'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.9'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='10' port='0x19'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.10'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='11' port='0x1a'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.11'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='12' port='0x1b'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.12'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='13' port='0x1c'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.13'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='14' port='0x1d'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.14'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='15' port='0x1e'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.15'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='16' port='0x1f'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.16'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='17' port='0x20'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.17'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='18' port='0x21'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.18'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='19' port='0x22'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.19'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='20' port='0x23'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.20'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='21' port='0x24'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.21'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='22' port='0x25'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.22'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='23' port='0x26'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.23'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='24' port='0x27'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.24'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-root-port'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target chassis='25' port='0x28'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.25'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model name='pcie-pci-bridge'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='pci.26'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='usb'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <controller type='sata' index='0'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='ide'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </controller>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <interface type='ethernet'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <mac address='fa:16:3e:5a:50:59'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target dev='tapdc081d90-d2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model type='virtio'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <driver name='vhost' rx_queue_size='512'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <mtu size='1442'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='net0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <serial type='pty'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <source path='/dev/pts/0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log' append='off'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target type='isa-serial' port='0'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:         <model name='isa-serial'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       </target>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <console type='pty' tty='/dev/pts/0'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <source path='/dev/pts/0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <log file='/var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528/console.log' append='off'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <target type='serial' port='0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='serial0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </console>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <input type='tablet' bus='usb'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='input0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='usb' bus='0' port='1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <input type='mouse' bus='ps2'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='input1'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <input type='keyboard' bus='ps2'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='input2'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </input>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <listen type='address' address='::0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </graphics>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <audio id='1' type='none'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <video>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <model type='virtio' heads='1' primary='yes'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='video0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </video>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <watchdog model='itco' action='reset'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='watchdog0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </watchdog>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <memballoon model='virtio'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <stats period='10'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='balloon0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <rng model='virtio'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <backend model='random'>/dev/urandom</backend>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <alias name='rng0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <label>system_u:system_r:svirt_t:s0:c66,c853</label>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c66,c853</imagelabel>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <label>+107:+107</label>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <imagelabel>+107:+107</imagelabel>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </seclabel>
Dec 05 09:37:51 compute-1 nova_compute[189066]: </domain>
Dec 05 09:37:51 compute-1 nova_compute[189066]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.320 189070 WARNING nova.virt.libvirt.driver [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Detaching interface fa:16:3e:b9:90:da failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap5f95901b-96' not found.
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.321 189070 DEBUG nova.virt.libvirt.vif [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:36:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:36:12Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.322 189070 DEBUG nova.network.os_vif_util [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Converting VIF {"id": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "address": "fa:16:3e:b9:90:da", "network": {"id": "28997311-91ec-41eb-8d69-3ce8625aacb4", "bridge": "br-int", "label": "tempest-network-smoke--2657185", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f95901b-96", "ovs_interfaceid": "5f95901b-9655-49a7-a46d-bfe7f4c86a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.323 189070 DEBUG nova.network.os_vif_util [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.323 189070 DEBUG os_vif [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.329 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.329 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f95901b-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.330 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.332 189070 INFO os_vif [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:90:da,bridge_name='br-int',has_traffic_filtering=True,id=5f95901b-9655-49a7-a46d-bfe7f4c86a96,network=Network(28997311-91ec-41eb-8d69-3ce8625aacb4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f95901b-96')
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.333 189070 DEBUG nova.virt.libvirt.guest [req-3fe1bf3d-be4b-47bc-9884-8a63d41354e6 req-ac6ee929-892d-4bfa-9c3b-e6239431b0e7 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:name>tempest-TestNetworkBasicOps-server-1152892970</nova:name>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:creationTime>2025-12-05 09:37:51</nova:creationTime>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:flavor name="m1.nano">
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:memory>128</nova:memory>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:disk>1</nova:disk>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:swap>0</nova:swap>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:vcpus>1</nova:vcpus>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:flavor>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:owner>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:owner>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   <nova:ports>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     <nova:port uuid="dc081d90-d263-461d-a7af-bc8649b2d24e">
Dec 05 09:37:51 compute-1 nova_compute[189066]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:37:51 compute-1 nova_compute[189066]:     </nova:port>
Dec 05 09:37:51 compute-1 nova_compute[189066]:   </nova:ports>
Dec 05 09:37:51 compute-1 nova_compute[189066]: </nova:instance>
Dec 05 09:37:51 compute-1 nova_compute[189066]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.376 189070 INFO nova.network.neutron [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Port 5f95901b-9655-49a7-a46d-bfe7f4c86a96 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.377 189070 DEBUG nova.network.neutron [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.409 189070 DEBUG oslo_concurrency.lockutils [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.457 189070 DEBUG oslo_concurrency.lockutils [None req-702a17de-0400-4457-93c0-f356fcae3e1e 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "interface-caf0a99c-b4d0-4fac-9883-ab0be359b528-5f95901b-9655-49a7-a46d-bfe7f4c86a96" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:51 compute-1 ovn_controller[95809]: 2025-12-05T09:37:51Z|00211|binding|INFO|Releasing lport e2b66518-c69f-4ca7-89db-70c0fcadafde from this chassis (sb_readonly=0)
Dec 05 09:37:51 compute-1 nova_compute[189066]: 2025-12-05 09:37:51.589 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:51 compute-1 podman[229671]: 2025-12-05 09:37:51.637931572 +0000 UTC m=+0.074532278 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.687 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.687 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.688 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.688 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.688 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.689 189070 INFO nova.compute.manager [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Terminating instance
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.690 189070 DEBUG nova.compute.manager [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:37:52 compute-1 kernel: tapdc081d90-d2 (unregistering): left promiscuous mode
Dec 05 09:37:52 compute-1 NetworkManager[55704]: <info>  [1764927472.7260] device (tapdc081d90-d2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.732 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:52 compute-1 ovn_controller[95809]: 2025-12-05T09:37:52Z|00212|binding|INFO|Releasing lport dc081d90-d263-461d-a7af-bc8649b2d24e from this chassis (sb_readonly=0)
Dec 05 09:37:52 compute-1 ovn_controller[95809]: 2025-12-05T09:37:52Z|00213|binding|INFO|Setting lport dc081d90-d263-461d-a7af-bc8649b2d24e down in Southbound
Dec 05 09:37:52 compute-1 ovn_controller[95809]: 2025-12-05T09:37:52Z|00214|binding|INFO|Removing iface tapdc081d90-d2 ovn-installed in OVS
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.754 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:52.764 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:50:59 10.100.0.7'], port_security=['fa:16:3e:5a:50:59 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'caf0a99c-b4d0-4fac-9883-ab0be359b528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9169e776-8a85-417a-9321-7e8c761484e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '346fffc5-09ed-4aab-8fb9-5dfa08cec170', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e4f212a-f0a6-4181-b812-3bfaf26b63d6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=dc081d90-d263-461d-a7af-bc8649b2d24e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:37:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:52.766 105272 INFO neutron.agent.ovn.metadata.agent [-] Port dc081d90-d263-461d-a7af-bc8649b2d24e in datapath 9169e776-8a85-417a-9321-7e8c761484e0 unbound from our chassis
Dec 05 09:37:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:52.768 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9169e776-8a85-417a-9321-7e8c761484e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.768 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:52.769 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a1a9c7-6426-42ec-b70f-dfa16e5f03ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:52.770 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0 namespace which is not needed anymore
Dec 05 09:37:52 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000022.scope: Deactivated successfully.
Dec 05 09:37:52 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000022.scope: Consumed 18.579s CPU time.
Dec 05 09:37:52 compute-1 systemd-machined[154815]: Machine qemu-15-instance-00000022 terminated.
Dec 05 09:37:52 compute-1 neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0[228861]: [NOTICE]   (228865) : haproxy version is 2.8.14-c23fe91
Dec 05 09:37:52 compute-1 neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0[228861]: [NOTICE]   (228865) : path to executable is /usr/sbin/haproxy
Dec 05 09:37:52 compute-1 neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0[228861]: [WARNING]  (228865) : Exiting Master process...
Dec 05 09:37:52 compute-1 neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0[228861]: [WARNING]  (228865) : Exiting Master process...
Dec 05 09:37:52 compute-1 neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0[228861]: [ALERT]    (228865) : Current worker (228867) exited with code 143 (Terminated)
Dec 05 09:37:52 compute-1 neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0[228861]: [WARNING]  (228865) : All workers exited. Exiting... (0)
Dec 05 09:37:52 compute-1 systemd[1]: libpod-df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8.scope: Deactivated successfully.
Dec 05 09:37:52 compute-1 podman[229716]: 2025-12-05 09:37:52.942987415 +0000 UTC m=+0.053640079 container died df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.955 189070 INFO nova.virt.libvirt.driver [-] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Instance destroyed successfully.
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.956 189070 DEBUG nova.objects.instance [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'resources' on Instance uuid caf0a99c-b4d0-4fac-9883-ab0be359b528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.984 189070 DEBUG nova.virt.libvirt.vif [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:35:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1152892970',display_name='tempest-TestNetworkBasicOps-server-1152892970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1152892970',id=34,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOX2Isf6e1VVEipyaY1bQ3oo7OBeHw8nA3AYDjK6acPXxSqT3BdAkhZsaMDELIc3uY4dO92h8xGVeDLew9jVMMUIkjYQ57YIHVif+pT2wHO21vXehX0yK2yZSKNfMCQjw==',key_name='tempest-TestNetworkBasicOps-1301626670',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:36:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-8kfznvii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:36:12Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=caf0a99c-b4d0-4fac-9883-ab0be359b528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.985 189070 DEBUG nova.network.os_vif_util [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.986 189070 DEBUG nova.network.os_vif_util [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:50:59,bridge_name='br-int',has_traffic_filtering=True,id=dc081d90-d263-461d-a7af-bc8649b2d24e,network=Network(9169e776-8a85-417a-9321-7e8c761484e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc081d90-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.986 189070 DEBUG os_vif [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:50:59,bridge_name='br-int',has_traffic_filtering=True,id=dc081d90-d263-461d-a7af-bc8649b2d24e,network=Network(9169e776-8a85-417a-9321-7e8c761484e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc081d90-d2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.987 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.988 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc081d90-d2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.990 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.992 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.995 189070 INFO os_vif [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:50:59,bridge_name='br-int',has_traffic_filtering=True,id=dc081d90-d263-461d-a7af-bc8649b2d24e,network=Network(9169e776-8a85-417a-9321-7e8c761484e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc081d90-d2')
Dec 05 09:37:52 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8-userdata-shm.mount: Deactivated successfully.
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.997 189070 INFO nova.virt.libvirt.driver [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Deleting instance files /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528_del
Dec 05 09:37:52 compute-1 nova_compute[189066]: 2025-12-05 09:37:52.998 189070 INFO nova.virt.libvirt.driver [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Deletion of /var/lib/nova/instances/caf0a99c-b4d0-4fac-9883-ab0be359b528_del complete
Dec 05 09:37:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-25efba73102b76344ef0a313c000447af8c3a8e9e49e312c169b3e9959726e8d-merged.mount: Deactivated successfully.
Dec 05 09:37:53 compute-1 podman[229716]: 2025-12-05 09:37:53.010406259 +0000 UTC m=+0.121058923 container cleanup df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:37:53 compute-1 systemd[1]: libpod-conmon-df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8.scope: Deactivated successfully.
Dec 05 09:37:53 compute-1 nova_compute[189066]: 2025-12-05 09:37:53.069 189070 INFO nova.compute.manager [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 05 09:37:53 compute-1 nova_compute[189066]: 2025-12-05 09:37:53.070 189070 DEBUG oslo.service.loopingcall [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:37:53 compute-1 nova_compute[189066]: 2025-12-05 09:37:53.070 189070 DEBUG nova.compute.manager [-] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:37:53 compute-1 nova_compute[189066]: 2025-12-05 09:37:53.070 189070 DEBUG nova.network.neutron [-] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:37:53 compute-1 podman[229764]: 2025-12-05 09:37:53.08591509 +0000 UTC m=+0.048080053 container remove df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.093 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0089b9-deff-4bd5-8565-2cfd22122d49]: (4, ('Fri Dec  5 09:37:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0 (df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8)\ndf081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8\nFri Dec  5 09:37:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0 (df081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8)\ndf081b9d3533857ba6e9cfbf2ffd731db37f80c472c03a0d58cf99afbcdc99c8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.096 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[554df38e-c40c-4ab3-bd24-9504356daecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.098 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9169e776-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:37:53 compute-1 nova_compute[189066]: 2025-12-05 09:37:53.100 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:53 compute-1 kernel: tap9169e776-80: left promiscuous mode
Dec 05 09:37:53 compute-1 nova_compute[189066]: 2025-12-05 09:37:53.112 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:53 compute-1 nova_compute[189066]: 2025-12-05 09:37:53.114 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.116 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f535da-7705-49f1-9388-1a7ffac94ff1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.138 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[27583dc0-d527-41c7-b842-fc839411f924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.141 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f5966ce0-2986-41e2-a92e-8f3716a1120b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.163 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[83e110b8-41c7-4f3d-8faa-0060c59ac471]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464416, 'reachable_time': 33463, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229778, 'error': None, 'target': 'ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:53 compute-1 systemd[1]: run-netns-ovnmeta\x2d9169e776\x2d8a85\x2d417a\x2d9321\x2d7e8c761484e0.mount: Deactivated successfully.
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.169 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9169e776-8a85-417a-9321-7e8c761484e0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:37:53 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:37:53.169 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[378fb53a-5572-4b7b-b73d-4836c8320884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:37:54 compute-1 nova_compute[189066]: 2025-12-05 09:37:54.692 189070 DEBUG nova.compute.manager [req-5689c7ac-b5ca-43a1-9460-5043b8b8e145 req-45c413cb-873b-43e6-a0bf-ce99fb631137 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-changed-dc081d90-d263-461d-a7af-bc8649b2d24e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:54 compute-1 nova_compute[189066]: 2025-12-05 09:37:54.693 189070 DEBUG nova.compute.manager [req-5689c7ac-b5ca-43a1-9460-5043b8b8e145 req-45c413cb-873b-43e6-a0bf-ce99fb631137 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing instance network info cache due to event network-changed-dc081d90-d263-461d-a7af-bc8649b2d24e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:37:54 compute-1 nova_compute[189066]: 2025-12-05 09:37:54.693 189070 DEBUG oslo_concurrency.lockutils [req-5689c7ac-b5ca-43a1-9460-5043b8b8e145 req-45c413cb-873b-43e6-a0bf-ce99fb631137 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:37:54 compute-1 nova_compute[189066]: 2025-12-05 09:37:54.693 189070 DEBUG oslo_concurrency.lockutils [req-5689c7ac-b5ca-43a1-9460-5043b8b8e145 req-45c413cb-873b-43e6-a0bf-ce99fb631137 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:37:54 compute-1 nova_compute[189066]: 2025-12-05 09:37:54.694 189070 DEBUG nova.network.neutron [req-5689c7ac-b5ca-43a1-9460-5043b8b8e145 req-45c413cb-873b-43e6-a0bf-ce99fb631137 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Refreshing network info cache for port dc081d90-d263-461d-a7af-bc8649b2d24e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:37:55 compute-1 nova_compute[189066]: 2025-12-05 09:37:55.190 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:55 compute-1 podman[229779]: 2025-12-05 09:37:55.687120248 +0000 UTC m=+0.118626404 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, org.label-schema.schema-version=1.0)
Dec 05 09:37:57 compute-1 nova_compute[189066]: 2025-12-05 09:37:57.942 189070 DEBUG nova.network.neutron [-] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:57 compute-1 nova_compute[189066]: 2025-12-05 09:37:57.968 189070 INFO nova.compute.manager [-] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Took 4.90 seconds to deallocate network for instance.
Dec 05 09:37:57 compute-1 nova_compute[189066]: 2025-12-05 09:37:57.992 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.016 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.017 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.089 189070 DEBUG nova.compute.provider_tree [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.111 189070 DEBUG nova.scheduler.client.report [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.155 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.197 189070 INFO nova.scheduler.client.report [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Deleted allocations for instance caf0a99c-b4d0-4fac-9883-ab0be359b528
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.295 189070 DEBUG oslo_concurrency.lockutils [None req-871fd66e-4f60-48b4-be6f-fac2c63ccf41 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.447 189070 DEBUG nova.network.neutron [req-5689c7ac-b5ca-43a1-9460-5043b8b8e145 req-45c413cb-873b-43e6-a0bf-ce99fb631137 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updated VIF entry in instance network info cache for port dc081d90-d263-461d-a7af-bc8649b2d24e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.448 189070 DEBUG nova.network.neutron [req-5689c7ac-b5ca-43a1-9460-5043b8b8e145 req-45c413cb-873b-43e6-a0bf-ce99fb631137 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Updating instance_info_cache with network_info: [{"id": "dc081d90-d263-461d-a7af-bc8649b2d24e", "address": "fa:16:3e:5a:50:59", "network": {"id": "9169e776-8a85-417a-9321-7e8c761484e0", "bridge": "br-int", "label": "tempest-network-smoke--736767212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc081d90-d2", "ovs_interfaceid": "dc081d90-d263-461d-a7af-bc8649b2d24e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:37:58 compute-1 nova_compute[189066]: 2025-12-05 09:37:58.469 189070 DEBUG oslo_concurrency.lockutils [req-5689c7ac-b5ca-43a1-9460-5043b8b8e145 req-45c413cb-873b-43e6-a0bf-ce99fb631137 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-caf0a99c-b4d0-4fac-9883-ab0be359b528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:37:58 compute-1 podman[229808]: 2025-12-05 09:37:58.62559163 +0000 UTC m=+0.065483978 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.265 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927464.262346, e52dbfff-7d40-42b9-98c0-efbf2c7e1f73 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.266 189070 INFO nova.compute.manager [-] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] VM Stopped (Lifecycle Event)
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.301 189070 DEBUG nova.compute.manager [None req-4f13687e-2285-4d62-ad83-f21c5eac6ffa - - - - - -] [instance: e52dbfff-7d40-42b9-98c0-efbf2c7e1f73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.883 189070 DEBUG nova.compute.manager [req-40747d58-88e0-4cf9-8726-2b94038ac919 req-730da9be-1351-478b-bfa4-d787b4cd1e29 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.884 189070 DEBUG oslo_concurrency.lockutils [req-40747d58-88e0-4cf9-8726-2b94038ac919 req-730da9be-1351-478b-bfa4-d787b4cd1e29 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.884 189070 DEBUG oslo_concurrency.lockutils [req-40747d58-88e0-4cf9-8726-2b94038ac919 req-730da9be-1351-478b-bfa4-d787b4cd1e29 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.884 189070 DEBUG oslo_concurrency.lockutils [req-40747d58-88e0-4cf9-8726-2b94038ac919 req-730da9be-1351-478b-bfa4-d787b4cd1e29 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "caf0a99c-b4d0-4fac-9883-ab0be359b528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.885 189070 DEBUG nova.compute.manager [req-40747d58-88e0-4cf9-8726-2b94038ac919 req-730da9be-1351-478b-bfa4-d787b4cd1e29 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] No waiting events found dispatching network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.885 189070 WARNING nova.compute.manager [req-40747d58-88e0-4cf9-8726-2b94038ac919 req-730da9be-1351-478b-bfa4-d787b4cd1e29 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received unexpected event network-vif-plugged-dc081d90-d263-461d-a7af-bc8649b2d24e for instance with vm_state deleted and task_state None.
Dec 05 09:37:59 compute-1 nova_compute[189066]: 2025-12-05 09:37:59.885 189070 DEBUG nova.compute.manager [req-40747d58-88e0-4cf9-8726-2b94038ac919 req-730da9be-1351-478b-bfa4-d787b4cd1e29 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Received event network-vif-deleted-dc081d90-d263-461d-a7af-bc8649b2d24e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:38:00 compute-1 nova_compute[189066]: 2025-12-05 09:38:00.192 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:00 compute-1 podman[229827]: 2025-12-05 09:38:00.627598827 +0000 UTC m=+0.065489758 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 09:38:03 compute-1 nova_compute[189066]: 2025-12-05 09:38:02.999 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:04 compute-1 podman[229847]: 2025-12-05 09:38:04.643997574 +0000 UTC m=+0.067753363 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Dec 05 09:38:05 compute-1 nova_compute[189066]: 2025-12-05 09:38:05.196 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:06 compute-1 nova_compute[189066]: 2025-12-05 09:38:06.907 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:07 compute-1 nova_compute[189066]: 2025-12-05 09:38:07.121 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:07 compute-1 nova_compute[189066]: 2025-12-05 09:38:07.952 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927472.9508085, caf0a99c-b4d0-4fac-9883-ab0be359b528 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:38:07 compute-1 nova_compute[189066]: 2025-12-05 09:38:07.953 189070 INFO nova.compute.manager [-] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] VM Stopped (Lifecycle Event)
Dec 05 09:38:07 compute-1 nova_compute[189066]: 2025-12-05 09:38:07.985 189070 DEBUG nova.compute.manager [None req-cb35dcda-4315-4149-a9c4-b8923da86fd6 - - - - - -] [instance: caf0a99c-b4d0-4fac-9883-ab0be359b528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:38:08 compute-1 nova_compute[189066]: 2025-12-05 09:38:08.001 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:08 compute-1 podman[229870]: 2025-12-05 09:38:08.6568422 +0000 UTC m=+0.087816858 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:38:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:08.884 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:08.885 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:08.886 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:10 compute-1 nova_compute[189066]: 2025-12-05 09:38:10.229 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:13 compute-1 nova_compute[189066]: 2025-12-05 09:38:13.005 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:14 compute-1 podman[229892]: 2025-12-05 09:38:14.661302232 +0000 UTC m=+0.093564280 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:38:15 compute-1 nova_compute[189066]: 2025-12-05 09:38:15.232 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:16 compute-1 nova_compute[189066]: 2025-12-05 09:38:16.038 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:17 compute-1 nova_compute[189066]: 2025-12-05 09:38:17.014 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:18 compute-1 nova_compute[189066]: 2025-12-05 09:38:18.012 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:18 compute-1 nova_compute[189066]: 2025-12-05 09:38:18.042 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:19 compute-1 nova_compute[189066]: 2025-12-05 09:38:19.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:20 compute-1 nova_compute[189066]: 2025-12-05 09:38:20.234 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:21 compute-1 nova_compute[189066]: 2025-12-05 09:38:21.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.075 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.075 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.076 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.076 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:38:22 compute-1 podman[229916]: 2025-12-05 09:38:22.121979325 +0000 UTC m=+0.069671384 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.274 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.275 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5775MB free_disk=73.32587051391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.275 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.275 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.411 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.411 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.433 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.656 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.866 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:38:22 compute-1 nova_compute[189066]: 2025-12-05 09:38:22.867 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:23 compute-1 nova_compute[189066]: 2025-12-05 09:38:23.015 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:23 compute-1 nova_compute[189066]: 2025-12-05 09:38:23.867 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:23 compute-1 nova_compute[189066]: 2025-12-05 09:38:23.868 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:38:23 compute-1 nova_compute[189066]: 2025-12-05 09:38:23.909 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:38:24 compute-1 nova_compute[189066]: 2025-12-05 09:38:24.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:24 compute-1 nova_compute[189066]: 2025-12-05 09:38:24.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:38:25 compute-1 nova_compute[189066]: 2025-12-05 09:38:25.237 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:26 compute-1 podman[229937]: 2025-12-05 09:38:26.65412703 +0000 UTC m=+0.094486251 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 09:38:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:27.644 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:38:27 compute-1 nova_compute[189066]: 2025-12-05 09:38:27.645 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:27.645 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:38:28 compute-1 nova_compute[189066]: 2025-12-05 09:38:28.017 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:29 compute-1 podman[229966]: 2025-12-05 09:38:29.617252225 +0000 UTC m=+0.059138337 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 05 09:38:30 compute-1 nova_compute[189066]: 2025-12-05 09:38:30.239 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:30 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:30.648 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:31 compute-1 podman[229988]: 2025-12-05 09:38:31.637707176 +0000 UTC m=+0.075039686 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:38:33 compute-1 nova_compute[189066]: 2025-12-05 09:38:33.020 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:33 compute-1 sshd-session[229986]: Connection reset by authenticating user root 45.140.17.124 port 36398 [preauth]
Dec 05 09:38:35 compute-1 nova_compute[189066]: 2025-12-05 09:38:35.242 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:35 compute-1 sshd-session[230007]: Connection reset by authenticating user root 45.140.17.124 port 27772 [preauth]
Dec 05 09:38:35 compute-1 podman[230009]: 2025-12-05 09:38:35.617548525 +0000 UTC m=+0.059496297 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc.)
Dec 05 09:38:37 compute-1 sshd-session[230030]: Connection reset by authenticating user root 45.140.17.124 port 27782 [preauth]
Dec 05 09:38:38 compute-1 nova_compute[189066]: 2025-12-05 09:38:38.030 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:39 compute-1 podman[230035]: 2025-12-05 09:38:39.61934455 +0000 UTC m=+0.057221471 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:38:40 compute-1 sshd-session[230033]: Connection reset by authenticating user root 45.140.17.124 port 27790 [preauth]
Dec 05 09:38:40 compute-1 nova_compute[189066]: 2025-12-05 09:38:40.263 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.048 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "6d29aa46-325d-41f0-9726-7eb3e727aab4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.049 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.068 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.165 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.166 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.174 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.174 189070 INFO nova.compute.claims [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.406 189070 DEBUG nova.compute.provider_tree [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.425 189070 DEBUG nova.scheduler.client.report [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.453 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.454 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.513 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.513 189070 DEBUG nova.network.neutron [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.547 189070 INFO nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.578 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.767 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.769 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.769 189070 INFO nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Creating image(s)
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.770 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "/var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.771 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.772 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.790 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.853 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.855 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.856 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.867 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.927 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.929 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.971 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.972 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:41 compute-1 nova_compute[189066]: 2025-12-05 09:38:41.972 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.032 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.034 189070 DEBUG nova.virt.disk.api [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Checking if we can resize image /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.034 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.095 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.096 189070 DEBUG nova.virt.disk.api [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Cannot resize image /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.097 189070 DEBUG nova.objects.instance [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d29aa46-325d-41f0-9726-7eb3e727aab4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.107 189070 DEBUG nova.policy [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.116 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.117 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Ensure instance console log exists: /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.117 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.118 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:42 compute-1 nova_compute[189066]: 2025-12-05 09:38:42.118 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:43 compute-1 nova_compute[189066]: 2025-12-05 09:38:43.034 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:43 compute-1 sshd-session[230060]: Connection reset by authenticating user root 45.140.17.124 port 27802 [preauth]
Dec 05 09:38:43 compute-1 nova_compute[189066]: 2025-12-05 09:38:43.873 189070 DEBUG nova.network.neutron [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Successfully updated port: e895fa16-2797-4f30-b0cc-644ea2908267 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:38:43 compute-1 nova_compute[189066]: 2025-12-05 09:38:43.894 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:38:43 compute-1 nova_compute[189066]: 2025-12-05 09:38:43.895 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:38:43 compute-1 nova_compute[189066]: 2025-12-05 09:38:43.895 189070 DEBUG nova.network.neutron [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:38:44 compute-1 nova_compute[189066]: 2025-12-05 09:38:44.027 189070 DEBUG nova.compute.manager [req-68ca5f19-04ee-4aad-8a69-0a50f174ccce req-a05b9a74-b670-4d36-ab6b-501871d8f313 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received event network-changed-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:38:44 compute-1 nova_compute[189066]: 2025-12-05 09:38:44.027 189070 DEBUG nova.compute.manager [req-68ca5f19-04ee-4aad-8a69-0a50f174ccce req-a05b9a74-b670-4d36-ab6b-501871d8f313 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Refreshing instance network info cache due to event network-changed-e895fa16-2797-4f30-b0cc-644ea2908267. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:38:44 compute-1 nova_compute[189066]: 2025-12-05 09:38:44.027 189070 DEBUG oslo_concurrency.lockutils [req-68ca5f19-04ee-4aad-8a69-0a50f174ccce req-a05b9a74-b670-4d36-ab6b-501871d8f313 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:38:44 compute-1 nova_compute[189066]: 2025-12-05 09:38:44.613 189070 DEBUG nova.network.neutron [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.268 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:45 compute-1 podman[230077]: 2025-12-05 09:38:45.611295946 +0000 UTC m=+0.052930996 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.798 189070 DEBUG nova.network.neutron [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Updating instance_info_cache with network_info: [{"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.823 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.824 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Instance network_info: |[{"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.824 189070 DEBUG oslo_concurrency.lockutils [req-68ca5f19-04ee-4aad-8a69-0a50f174ccce req-a05b9a74-b670-4d36-ab6b-501871d8f313 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.825 189070 DEBUG nova.network.neutron [req-68ca5f19-04ee-4aad-8a69-0a50f174ccce req-a05b9a74-b670-4d36-ab6b-501871d8f313 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Refreshing network info cache for port e895fa16-2797-4f30-b0cc-644ea2908267 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.829 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Start _get_guest_xml network_info=[{"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.836 189070 WARNING nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.843 189070 DEBUG nova.virt.libvirt.host [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.843 189070 DEBUG nova.virt.libvirt.host [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.853 189070 DEBUG nova.virt.libvirt.host [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.854 189070 DEBUG nova.virt.libvirt.host [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.856 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.856 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.857 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.857 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.857 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.857 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.858 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.858 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.858 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.859 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.859 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.859 189070 DEBUG nova.virt.hardware [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.864 189070 DEBUG nova.virt.libvirt.vif [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:38:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-137292075',display_name='tempest-TestNetworkBasicOps-server-137292075',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-137292075',id=40,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBYkd+LG9XLs4SavaCNkHkE3ehx/EikY9dqHG31SVnXmqKcLCEPdT3GmKjeUultSC2Kd6kMXS8cYi0YDAGEAGxdGJZZ1TtOeZraf//ClPqOEJb5BPrshtg+A2mkYLqgvjw==',key_name='tempest-TestNetworkBasicOps-460138570',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-gx0u88vz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:38:41Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=6d29aa46-325d-41f0-9726-7eb3e727aab4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.864 189070 DEBUG nova.network.os_vif_util [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.865 189070 DEBUG nova.network.os_vif_util [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.866 189070 DEBUG nova.objects.instance [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d29aa46-325d-41f0-9726-7eb3e727aab4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.900 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <uuid>6d29aa46-325d-41f0-9726-7eb3e727aab4</uuid>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <name>instance-00000028</name>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkBasicOps-server-137292075</nova:name>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:38:45</nova:creationTime>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:38:45 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:38:45 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:38:45 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:38:45 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:38:45 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:38:45 compute-1 nova_compute[189066]:         <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:38:45 compute-1 nova_compute[189066]:         <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:38:45 compute-1 nova_compute[189066]:         <nova:port uuid="e895fa16-2797-4f30-b0cc-644ea2908267">
Dec 05 09:38:45 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <system>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <entry name="serial">6d29aa46-325d-41f0-9726-7eb3e727aab4</entry>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <entry name="uuid">6d29aa46-325d-41f0-9726-7eb3e727aab4</entry>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </system>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <os>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   </os>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <features>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   </features>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk.config"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:ae:d0:75"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <target dev="tape895fa16-27"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/console.log" append="off"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <video>
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </video>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:38:45 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:38:45 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:38:45 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:38:45 compute-1 nova_compute[189066]: </domain>
Dec 05 09:38:45 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.902 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Preparing to wait for external event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.902 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.903 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.903 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.904 189070 DEBUG nova.virt.libvirt.vif [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:38:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-137292075',display_name='tempest-TestNetworkBasicOps-server-137292075',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-137292075',id=40,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBYkd+LG9XLs4SavaCNkHkE3ehx/EikY9dqHG31SVnXmqKcLCEPdT3GmKjeUultSC2Kd6kMXS8cYi0YDAGEAGxdGJZZ1TtOeZraf//ClPqOEJb5BPrshtg+A2mkYLqgvjw==',key_name='tempest-TestNetworkBasicOps-460138570',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-gx0u88vz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:38:41Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=6d29aa46-325d-41f0-9726-7eb3e727aab4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.904 189070 DEBUG nova.network.os_vif_util [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.904 189070 DEBUG nova.network.os_vif_util [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.905 189070 DEBUG os_vif [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.906 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.907 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.907 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.910 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.911 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape895fa16-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.911 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape895fa16-27, col_values=(('external_ids', {'iface-id': 'e895fa16-2797-4f30-b0cc-644ea2908267', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:d0:75', 'vm-uuid': '6d29aa46-325d-41f0-9726-7eb3e727aab4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.913 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:45 compute-1 NetworkManager[55704]: <info>  [1764927525.9138] manager: (tape895fa16-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.916 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.922 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.922 189070 INFO os_vif [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27')
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.992 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.993 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.993 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:ae:d0:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:38:45 compute-1 nova_compute[189066]: 2025-12-05 09:38:45.993 189070 INFO nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Using config drive
Dec 05 09:38:46 compute-1 nova_compute[189066]: 2025-12-05 09:38:46.713 189070 INFO nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Creating config drive at /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk.config
Dec 05 09:38:46 compute-1 nova_compute[189066]: 2025-12-05 09:38:46.719 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdm30f1jg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:38:46 compute-1 nova_compute[189066]: 2025-12-05 09:38:46.852 189070 DEBUG oslo_concurrency.processutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdm30f1jg" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:38:46 compute-1 kernel: tape895fa16-27: entered promiscuous mode
Dec 05 09:38:46 compute-1 NetworkManager[55704]: <info>  [1764927526.9325] manager: (tape895fa16-27): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Dec 05 09:38:46 compute-1 ovn_controller[95809]: 2025-12-05T09:38:46Z|00215|binding|INFO|Claiming lport e895fa16-2797-4f30-b0cc-644ea2908267 for this chassis.
Dec 05 09:38:46 compute-1 ovn_controller[95809]: 2025-12-05T09:38:46Z|00216|binding|INFO|e895fa16-2797-4f30-b0cc-644ea2908267: Claiming fa:16:3e:ae:d0:75 10.100.0.3
Dec 05 09:38:46 compute-1 nova_compute[189066]: 2025-12-05 09:38:46.939 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:46 compute-1 nova_compute[189066]: 2025-12-05 09:38:46.944 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.956 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:d0:75 10.100.0.3'], port_security=['fa:16:3e:ae:d0:75 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-45866682', 'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6d29aa46-325d-41f0-9726-7eb3e727aab4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-45866682', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b999a1ea-2a47-43bb-b6d8-1df5b14a091a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4843ac73-160e-4d90-a31d-4942b2dafdeb, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=e895fa16-2797-4f30-b0cc-644ea2908267) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.957 105272 INFO neutron.agent.ovn.metadata.agent [-] Port e895fa16-2797-4f30-b0cc-644ea2908267 in datapath cc4506e6-cfb6-485f-9e7b-3a80985c5a7e bound to our chassis
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.959 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc4506e6-cfb6-485f-9e7b-3a80985c5a7e
Dec 05 09:38:46 compute-1 systemd-udevd[230119]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.974 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[635effae-6d62-428b-9fa1-4857b2e83b24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.976 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc4506e6-c1 in ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.977 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc4506e6-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.977 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc8b513-bd8c-4791-b99f-c71493f2be2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.979 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[51004973-7fd1-47e7-b2f0-e441dae326f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:46 compute-1 systemd-machined[154815]: New machine qemu-17-instance-00000028.
Dec 05 09:38:46 compute-1 NetworkManager[55704]: <info>  [1764927526.9909] device (tape895fa16-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:38:46 compute-1 NetworkManager[55704]: <info>  [1764927526.9919] device (tape895fa16-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:38:46 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:46.994 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[1e351dfd-c74e-4106-9695-ffa487d7358a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:46 compute-1 ovn_controller[95809]: 2025-12-05T09:38:46Z|00217|binding|INFO|Setting lport e895fa16-2797-4f30-b0cc-644ea2908267 ovn-installed in OVS
Dec 05 09:38:46 compute-1 ovn_controller[95809]: 2025-12-05T09:38:46Z|00218|binding|INFO|Setting lport e895fa16-2797-4f30-b0cc-644ea2908267 up in Southbound
Dec 05 09:38:46 compute-1 nova_compute[189066]: 2025-12-05 09:38:46.998 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:47 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-00000028.
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.023 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e18862-0762-4b16-9c21-0680cb063255]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.062 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[0f85c3bc-485d-4895-bce5-40de03ce7da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 NetworkManager[55704]: <info>  [1764927527.0703] manager: (tapcc4506e6-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Dec 05 09:38:47 compute-1 systemd-udevd[230124]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.072 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[29a00669-d6a4-4b99-ae26-df6a968f7072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.117 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0ef1b0-c0bf-40cd-a339-363772e67ac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.120 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[4ac1d200-4de0-43b2-867e-62c17e4304b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 NetworkManager[55704]: <info>  [1764927527.1462] device (tapcc4506e6-c0): carrier: link connected
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.151 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[6be56df2-0271-413a-83b5-7c4ce0bb515d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.175 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[401d9abc-47cd-4c5c-b688-a74e7c4faa38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc4506e6-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480014, 'reachable_time': 20952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230153, 'error': None, 'target': 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.202 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed4283f-c847-4634-99f4-606b9644f05f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:5d24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480014, 'tstamp': 480014}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230154, 'error': None, 'target': 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.230 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4693b756-e72a-4fde-90d4-1afca73c61e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc4506e6-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480014, 'reachable_time': 20952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230155, 'error': None, 'target': 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.283 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b07334ef-ab66-4a4d-b4a0-77f82611d70f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.333 189070 DEBUG nova.compute.manager [req-e0ebb694-4149-498e-b72d-919373e9f818 req-1245053f-2d1f-41b1-8214-45d73f7820e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.335 189070 DEBUG oslo_concurrency.lockutils [req-e0ebb694-4149-498e-b72d-919373e9f818 req-1245053f-2d1f-41b1-8214-45d73f7820e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.335 189070 DEBUG oslo_concurrency.lockutils [req-e0ebb694-4149-498e-b72d-919373e9f818 req-1245053f-2d1f-41b1-8214-45d73f7820e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.336 189070 DEBUG oslo_concurrency.lockutils [req-e0ebb694-4149-498e-b72d-919373e9f818 req-1245053f-2d1f-41b1-8214-45d73f7820e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.336 189070 DEBUG nova.compute.manager [req-e0ebb694-4149-498e-b72d-919373e9f818 req-1245053f-2d1f-41b1-8214-45d73f7820e4 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Processing event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.366 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7d6fc6-18b4-46c1-9c72-a2d758cd5d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.368 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc4506e6-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.368 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.369 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc4506e6-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:47 compute-1 kernel: tapcc4506e6-c0: entered promiscuous mode
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.372 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:47 compute-1 NetworkManager[55704]: <info>  [1764927527.3741] manager: (tapcc4506e6-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.379 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc4506e6-c0, col_values=(('external_ids', {'iface-id': '58d8efc9-6765-49f2-921c-3850e3ec8c81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.380 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:47 compute-1 ovn_controller[95809]: 2025-12-05T09:38:47Z|00219|binding|INFO|Releasing lport 58d8efc9-6765-49f2-921c-3850e3ec8c81 from this chassis (sb_readonly=0)
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.382 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc4506e6-cfb6-485f-9e7b-3a80985c5a7e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc4506e6-cfb6-485f-9e7b-3a80985c5a7e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.383 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e71e2c3a-a89f-4bf3-b6e5-148df4de9278]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.384 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/cc4506e6-cfb6-485f-9e7b-3a80985c5a7e.pid.haproxy
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID cc4506e6-cfb6-485f-9e7b-3a80985c5a7e
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:38:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:47.386 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'env', 'PROCESS_TAG=haproxy-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc4506e6-cfb6-485f-9e7b-3a80985c5a7e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.394 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.830 189070 DEBUG nova.network.neutron [req-68ca5f19-04ee-4aad-8a69-0a50f174ccce req-a05b9a74-b670-4d36-ab6b-501871d8f313 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Updated VIF entry in instance network info cache for port e895fa16-2797-4f30-b0cc-644ea2908267. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.831 189070 DEBUG nova.network.neutron [req-68ca5f19-04ee-4aad-8a69-0a50f174ccce req-a05b9a74-b670-4d36-ab6b-501871d8f313 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Updating instance_info_cache with network_info: [{"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:38:47 compute-1 nova_compute[189066]: 2025-12-05 09:38:47.854 189070 DEBUG oslo_concurrency.lockutils [req-68ca5f19-04ee-4aad-8a69-0a50f174ccce req-a05b9a74-b670-4d36-ab6b-501871d8f313 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:38:47 compute-1 podman[230187]: 2025-12-05 09:38:47.79176771 +0000 UTC m=+0.026590592 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:38:47 compute-1 podman[230187]: 2025-12-05 09:38:47.896487651 +0000 UTC m=+0.131310503 container create 6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 09:38:47 compute-1 systemd[1]: Started libpod-conmon-6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8.scope.
Dec 05 09:38:47 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:38:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad7a1f7f1b7855d4070468473f63bf74743c55da7ea8bf663c62f87c09ceda1f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:38:47 compute-1 podman[230187]: 2025-12-05 09:38:47.987776723 +0000 UTC m=+0.222599595 container init 6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 05 09:38:47 compute-1 podman[230187]: 2025-12-05 09:38:47.993096274 +0000 UTC m=+0.227919126 container start 6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 09:38:48 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230202]: [NOTICE]   (230206) : New worker (230208) forked
Dec 05 09:38:48 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230202]: [NOTICE]   (230206) : Loading success.
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.508 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.510 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927528.5075192, 6d29aa46-325d-41f0-9726-7eb3e727aab4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.510 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] VM Started (Lifecycle Event)
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.514 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.518 189070 INFO nova.virt.libvirt.driver [-] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Instance spawned successfully.
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.519 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.699 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.704 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.928 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.929 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.929 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.930 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.930 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:38:48 compute-1 nova_compute[189066]: 2025-12-05 09:38:48.931 189070 DEBUG nova.virt.libvirt.driver [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.061 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.062 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927528.509161, 6d29aa46-325d-41f0-9726-7eb3e727aab4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.062 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] VM Paused (Lifecycle Event)
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.103 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.109 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927528.5138109, 6d29aa46-325d-41f0-9726-7eb3e727aab4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.109 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] VM Resumed (Lifecycle Event)
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.139 189070 INFO nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Took 7.37 seconds to spawn the instance on the hypervisor.
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.139 189070 DEBUG nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.141 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.148 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.230 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.303 189070 INFO nova.compute.manager [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Took 8.19 seconds to build instance.
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.366 189070 DEBUG oslo_concurrency.lockutils [None req-110a88ed-70c2-45c4-a212-ffa1c71b817b 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.495 189070 DEBUG nova.compute.manager [req-142802bb-b73d-4326-8576-4b718f2470c8 req-7f8cf47a-eaef-4391-98ac-a0abb2f2ecf1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.496 189070 DEBUG oslo_concurrency.lockutils [req-142802bb-b73d-4326-8576-4b718f2470c8 req-7f8cf47a-eaef-4391-98ac-a0abb2f2ecf1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.498 189070 DEBUG oslo_concurrency.lockutils [req-142802bb-b73d-4326-8576-4b718f2470c8 req-7f8cf47a-eaef-4391-98ac-a0abb2f2ecf1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.499 189070 DEBUG oslo_concurrency.lockutils [req-142802bb-b73d-4326-8576-4b718f2470c8 req-7f8cf47a-eaef-4391-98ac-a0abb2f2ecf1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.500 189070 DEBUG nova.compute.manager [req-142802bb-b73d-4326-8576-4b718f2470c8 req-7f8cf47a-eaef-4391-98ac-a0abb2f2ecf1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] No waiting events found dispatching network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:38:49 compute-1 nova_compute[189066]: 2025-12-05 09:38:49.501 189070 WARNING nova.compute.manager [req-142802bb-b73d-4326-8576-4b718f2470c8 req-7f8cf47a-eaef-4391-98ac-a0abb2f2ecf1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received unexpected event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 for instance with vm_state active and task_state None.
Dec 05 09:38:50 compute-1 nova_compute[189066]: 2025-12-05 09:38:50.272 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:50 compute-1 nova_compute[189066]: 2025-12-05 09:38:50.914 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:52 compute-1 podman[230224]: 2025-12-05 09:38:52.636473819 +0000 UTC m=+0.066765384 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.153 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 NetworkManager[55704]: <info>  [1764927535.1637] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Dec 05 09:38:55 compute-1 NetworkManager[55704]: <info>  [1764927535.1649] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.218 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 ovn_controller[95809]: 2025-12-05T09:38:55Z|00220|binding|INFO|Releasing lport 58d8efc9-6765-49f2-921c-3850e3ec8c81 from this chassis (sb_readonly=0)
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.233 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.275 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.642 189070 DEBUG nova.compute.manager [req-d90ee646-d667-4a90-bc9e-2c991768afed req-dfd55115-1482-4240-bad3-2381f204c017 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received event network-changed-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.643 189070 DEBUG nova.compute.manager [req-d90ee646-d667-4a90-bc9e-2c991768afed req-dfd55115-1482-4240-bad3-2381f204c017 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Refreshing instance network info cache due to event network-changed-e895fa16-2797-4f30-b0cc-644ea2908267. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.643 189070 DEBUG oslo_concurrency.lockutils [req-d90ee646-d667-4a90-bc9e-2c991768afed req-dfd55115-1482-4240-bad3-2381f204c017 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.644 189070 DEBUG oslo_concurrency.lockutils [req-d90ee646-d667-4a90-bc9e-2c991768afed req-dfd55115-1482-4240-bad3-2381f204c017 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.644 189070 DEBUG nova.network.neutron [req-d90ee646-d667-4a90-bc9e-2c991768afed req-dfd55115-1482-4240-bad3-2381f204c017 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Refreshing network info cache for port e895fa16-2797-4f30-b0cc-644ea2908267 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.760 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "6d29aa46-325d-41f0-9726-7eb3e727aab4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.761 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.761 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.761 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.762 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.763 189070 INFO nova.compute.manager [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Terminating instance
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.764 189070 DEBUG nova.compute.manager [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:38:55 compute-1 kernel: tape895fa16-27 (unregistering): left promiscuous mode
Dec 05 09:38:55 compute-1 NetworkManager[55704]: <info>  [1764927535.7832] device (tape895fa16-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:38:55 compute-1 ovn_controller[95809]: 2025-12-05T09:38:55Z|00221|binding|INFO|Releasing lport e895fa16-2797-4f30-b0cc-644ea2908267 from this chassis (sb_readonly=0)
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.790 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 ovn_controller[95809]: 2025-12-05T09:38:55Z|00222|binding|INFO|Setting lport e895fa16-2797-4f30-b0cc-644ea2908267 down in Southbound
Dec 05 09:38:55 compute-1 ovn_controller[95809]: 2025-12-05T09:38:55Z|00223|binding|INFO|Removing iface tape895fa16-27 ovn-installed in OVS
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.793 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.808 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:55.817 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:d0:75 10.100.0.3'], port_security=['fa:16:3e:ae:d0:75 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-45866682', 'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6d29aa46-325d-41f0-9726-7eb3e727aab4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-45866682', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b999a1ea-2a47-43bb-b6d8-1df5b14a091a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4843ac73-160e-4d90-a31d-4942b2dafdeb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=e895fa16-2797-4f30-b0cc-644ea2908267) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:38:55 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:55.819 105272 INFO neutron.agent.ovn.metadata.agent [-] Port e895fa16-2797-4f30-b0cc-644ea2908267 in datapath cc4506e6-cfb6-485f-9e7b-3a80985c5a7e unbound from our chassis
Dec 05 09:38:55 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:55.821 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:38:55 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:55.823 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8d46b1-cda4-4b74-b7bc-e9eb4068ec6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:55 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:55.824 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e namespace which is not needed anymore
Dec 05 09:38:55 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000028.scope: Deactivated successfully.
Dec 05 09:38:55 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000028.scope: Consumed 8.965s CPU time.
Dec 05 09:38:55 compute-1 systemd-machined[154815]: Machine qemu-17-instance-00000028 terminated.
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.917 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230202]: [NOTICE]   (230206) : haproxy version is 2.8.14-c23fe91
Dec 05 09:38:55 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230202]: [NOTICE]   (230206) : path to executable is /usr/sbin/haproxy
Dec 05 09:38:55 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230202]: [WARNING]  (230206) : Exiting Master process...
Dec 05 09:38:55 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230202]: [ALERT]    (230206) : Current worker (230208) exited with code 143 (Terminated)
Dec 05 09:38:55 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230202]: [WARNING]  (230206) : All workers exited. Exiting... (0)
Dec 05 09:38:55 compute-1 systemd[1]: libpod-6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8.scope: Deactivated successfully.
Dec 05 09:38:55 compute-1 podman[230269]: 2025-12-05 09:38:55.986002792 +0000 UTC m=+0.050160206 container died 6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.990 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:55 compute-1 nova_compute[189066]: 2025-12-05 09:38:55.996 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8-userdata-shm.mount: Deactivated successfully.
Dec 05 09:38:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-ad7a1f7f1b7855d4070468473f63bf74743c55da7ea8bf663c62f87c09ceda1f-merged.mount: Deactivated successfully.
Dec 05 09:38:56 compute-1 podman[230269]: 2025-12-05 09:38:56.027142009 +0000 UTC m=+0.091299373 container cleanup 6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.041 189070 INFO nova.virt.libvirt.driver [-] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Instance destroyed successfully.
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.042 189070 DEBUG nova.objects.instance [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'resources' on Instance uuid 6d29aa46-325d-41f0-9726-7eb3e727aab4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:38:56 compute-1 systemd[1]: libpod-conmon-6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8.scope: Deactivated successfully.
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.059 189070 DEBUG nova.virt.libvirt.vif [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:38:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-137292075',display_name='tempest-TestNetworkBasicOps-server-137292075',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-137292075',id=40,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBYkd+LG9XLs4SavaCNkHkE3ehx/EikY9dqHG31SVnXmqKcLCEPdT3GmKjeUultSC2Kd6kMXS8cYi0YDAGEAGxdGJZZ1TtOeZraf//ClPqOEJb5BPrshtg+A2mkYLqgvjw==',key_name='tempest-TestNetworkBasicOps-460138570',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:38:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-gx0u88vz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:38:49Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=6d29aa46-325d-41f0-9726-7eb3e727aab4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.059 189070 DEBUG nova.network.os_vif_util [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.060 189070 DEBUG nova.network.os_vif_util [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.060 189070 DEBUG os_vif [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.063 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.063 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape895fa16-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.065 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.075 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.079 189070 INFO os_vif [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27')
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.079 189070 INFO nova.virt.libvirt.driver [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Deleting instance files /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4_del
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.080 189070 INFO nova.virt.libvirt.driver [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Deletion of /var/lib/nova/instances/6d29aa46-325d-41f0-9726-7eb3e727aab4_del complete
Dec 05 09:38:56 compute-1 podman[230313]: 2025-12-05 09:38:56.119423006 +0000 UTC m=+0.062959991 container remove 6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.126 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4b761c1c-4579-473e-96df-039a79a01f6e]: (4, ('Fri Dec  5 09:38:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e (6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8)\n6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8\nFri Dec  5 09:38:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e (6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8)\n6751f3027caf28ec95029c39e0a1119bbfa16210a718e81cdd1b104329f300a8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.129 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f7a5bf-c698-45ae-a4ff-d41a2de2354d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.130 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc4506e6-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:38:56 compute-1 kernel: tapcc4506e6-c0: left promiscuous mode
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.175 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.180 189070 INFO nova.compute.manager [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Took 0.42 seconds to destroy the instance on the hypervisor.
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.181 189070 DEBUG oslo.service.loopingcall [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.181 189070 DEBUG nova.compute.manager [-] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.181 189070 DEBUG nova.network.neutron [-] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.185 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.187 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b27b0ed6-ca3a-4491-8406-24d27b84f296]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.202 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[44ca5ecd-e929-43ac-91e3-c428f10d4688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.203 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5d178b71-cb67-4c93-ad55-b4ea8213a148]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.227 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7a376f86-312d-4984-a902-e5088215ac88]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480005, 'reachable_time': 41860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230330, 'error': None, 'target': 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.232 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:38:56 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:38:56.232 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1cead4-605e-4208-8bc6-2727aa070a20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:38:56 compute-1 systemd[1]: run-netns-ovnmeta\x2dcc4506e6\x2dcfb6\x2d485f\x2d9e7b\x2d3a80985c5a7e.mount: Deactivated successfully.
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.854 189070 DEBUG nova.compute.manager [req-edd01edc-79fe-4393-925b-99a528563658 req-8e5f352d-5ce2-4bea-8969-2749de8f278a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received event network-vif-unplugged-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.855 189070 DEBUG oslo_concurrency.lockutils [req-edd01edc-79fe-4393-925b-99a528563658 req-8e5f352d-5ce2-4bea-8969-2749de8f278a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.855 189070 DEBUG oslo_concurrency.lockutils [req-edd01edc-79fe-4393-925b-99a528563658 req-8e5f352d-5ce2-4bea-8969-2749de8f278a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.855 189070 DEBUG oslo_concurrency.lockutils [req-edd01edc-79fe-4393-925b-99a528563658 req-8e5f352d-5ce2-4bea-8969-2749de8f278a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.856 189070 DEBUG nova.compute.manager [req-edd01edc-79fe-4393-925b-99a528563658 req-8e5f352d-5ce2-4bea-8969-2749de8f278a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] No waiting events found dispatching network-vif-unplugged-e895fa16-2797-4f30-b0cc-644ea2908267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:38:56 compute-1 nova_compute[189066]: 2025-12-05 09:38:56.856 189070 DEBUG nova.compute.manager [req-edd01edc-79fe-4393-925b-99a528563658 req-8e5f352d-5ce2-4bea-8969-2749de8f278a 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received event network-vif-unplugged-e895fa16-2797-4f30-b0cc-644ea2908267 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:38:57 compute-1 nova_compute[189066]: 2025-12-05 09:38:57.266 189070 DEBUG nova.network.neutron [req-d90ee646-d667-4a90-bc9e-2c991768afed req-dfd55115-1482-4240-bad3-2381f204c017 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Updated VIF entry in instance network info cache for port e895fa16-2797-4f30-b0cc-644ea2908267. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:38:57 compute-1 nova_compute[189066]: 2025-12-05 09:38:57.267 189070 DEBUG nova.network.neutron [req-d90ee646-d667-4a90-bc9e-2c991768afed req-dfd55115-1482-4240-bad3-2381f204c017 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Updating instance_info_cache with network_info: [{"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:38:57 compute-1 nova_compute[189066]: 2025-12-05 09:38:57.289 189070 DEBUG oslo_concurrency.lockutils [req-d90ee646-d667-4a90-bc9e-2c991768afed req-dfd55115-1482-4240-bad3-2381f204c017 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-6d29aa46-325d-41f0-9726-7eb3e727aab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:38:57 compute-1 podman[230331]: 2025-12-05 09:38:57.651255588 +0000 UTC m=+0.090320761 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:38:57 compute-1 nova_compute[189066]: 2025-12-05 09:38:57.843 189070 DEBUG nova.network.neutron [-] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:38:57 compute-1 nova_compute[189066]: 2025-12-05 09:38:57.867 189070 INFO nova.compute.manager [-] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Took 1.69 seconds to deallocate network for instance.
Dec 05 09:38:57 compute-1 nova_compute[189066]: 2025-12-05 09:38:57.946 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:38:57 compute-1 nova_compute[189066]: 2025-12-05 09:38:57.951 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:57 compute-1 nova_compute[189066]: 2025-12-05 09:38:57.952 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:58 compute-1 nova_compute[189066]: 2025-12-05 09:38:58.027 189070 DEBUG nova.compute.provider_tree [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:38:58 compute-1 nova_compute[189066]: 2025-12-05 09:38:58.049 189070 DEBUG nova.scheduler.client.report [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:38:58 compute-1 nova_compute[189066]: 2025-12-05 09:38:58.075 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:58 compute-1 nova_compute[189066]: 2025-12-05 09:38:58.122 189070 INFO nova.scheduler.client.report [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Deleted allocations for instance 6d29aa46-325d-41f0-9726-7eb3e727aab4
Dec 05 09:38:58 compute-1 nova_compute[189066]: 2025-12-05 09:38:58.222 189070 DEBUG oslo_concurrency.lockutils [None req-21d68eba-0fe2-4104-92b3-7fe94e8cb1dc 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:59 compute-1 nova_compute[189066]: 2025-12-05 09:38:59.372 189070 DEBUG nova.compute.manager [req-2ab49e75-a7f3-4ca7-8c9d-7c69565aa57f req-a551cc23-f951-43fe-9855-a265ef528beb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:38:59 compute-1 nova_compute[189066]: 2025-12-05 09:38:59.373 189070 DEBUG oslo_concurrency.lockutils [req-2ab49e75-a7f3-4ca7-8c9d-7c69565aa57f req-a551cc23-f951-43fe-9855-a265ef528beb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:38:59 compute-1 nova_compute[189066]: 2025-12-05 09:38:59.373 189070 DEBUG oslo_concurrency.lockutils [req-2ab49e75-a7f3-4ca7-8c9d-7c69565aa57f req-a551cc23-f951-43fe-9855-a265ef528beb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:38:59 compute-1 nova_compute[189066]: 2025-12-05 09:38:59.373 189070 DEBUG oslo_concurrency.lockutils [req-2ab49e75-a7f3-4ca7-8c9d-7c69565aa57f req-a551cc23-f951-43fe-9855-a265ef528beb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "6d29aa46-325d-41f0-9726-7eb3e727aab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:38:59 compute-1 nova_compute[189066]: 2025-12-05 09:38:59.374 189070 DEBUG nova.compute.manager [req-2ab49e75-a7f3-4ca7-8c9d-7c69565aa57f req-a551cc23-f951-43fe-9855-a265ef528beb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] No waiting events found dispatching network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:38:59 compute-1 nova_compute[189066]: 2025-12-05 09:38:59.374 189070 WARNING nova.compute.manager [req-2ab49e75-a7f3-4ca7-8c9d-7c69565aa57f req-a551cc23-f951-43fe-9855-a265ef528beb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Received unexpected event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 for instance with vm_state deleted and task_state None.
Dec 05 09:39:00 compute-1 nova_compute[189066]: 2025-12-05 09:39:00.277 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:00 compute-1 podman[230357]: 2025-12-05 09:39:00.630590948 +0000 UTC m=+0.058916291 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 05 09:39:01 compute-1 nova_compute[189066]: 2025-12-05 09:39:01.067 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:02 compute-1 nova_compute[189066]: 2025-12-05 09:39:02.094 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:02 compute-1 podman[230376]: 2025-12-05 09:39:02.639932977 +0000 UTC m=+0.078450429 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 09:39:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:05.078 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:a2:21 10.100.0.2 2001:db8::f816:3eff:fedf:a221'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fedf:a221/64', 'neutron:device_id': 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44e16512-8e6e-41d8-8a27-2e539ff56608, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=14946cf1-e45b-478f-9cf6-cbb706b4f055) old=Port_Binding(mac=['fa:16:3e:df:a2:21 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:39:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:05.080 105272 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 14946cf1-e45b-478f-9cf6-cbb706b4f055 in datapath 73cf2b7d-b284-4d8e-896e-ab561df20f30 updated
Dec 05 09:39:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:05.082 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 73cf2b7d-b284-4d8e-896e-ab561df20f30, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:39:05 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:05.084 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[78d21bc3-8d46-4241-8db1-ed9b46dd6738]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:05 compute-1 nova_compute[189066]: 2025-12-05 09:39:05.279 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:06 compute-1 nova_compute[189066]: 2025-12-05 09:39:06.070 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:06 compute-1 podman[230396]: 2025-12-05 09:39:06.624840189 +0000 UTC m=+0.066158658 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter)
Dec 05 09:39:06 compute-1 nova_compute[189066]: 2025-12-05 09:39:06.908 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "732f3a01-82af-4597-8b48-7cc47c00edb8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:06 compute-1 nova_compute[189066]: 2025-12-05 09:39:06.909 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:06 compute-1 nova_compute[189066]: 2025-12-05 09:39:06.936 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.041 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.041 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.049 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.050 189070 INFO nova.compute.claims [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.218 189070 DEBUG nova.compute.provider_tree [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.236 189070 DEBUG nova.scheduler.client.report [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.259 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.260 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.324 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.326 189070 DEBUG nova.network.neutron [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.357 189070 INFO nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.378 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.808 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.812 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.813 189070 INFO nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Creating image(s)
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.814 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "/var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.814 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.816 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "/var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.832 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.901 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.903 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.904 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.920 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.983 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:07 compute-1 nova_compute[189066]: 2025-12-05 09:39:07.985 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.029 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.030 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.031 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.097 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.098 189070 DEBUG nova.virt.disk.api [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Checking if we can resize image /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.099 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.166 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.167 189070 DEBUG nova.virt.disk.api [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Cannot resize image /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.168 189070 DEBUG nova.objects.instance [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 732f3a01-82af-4597-8b48-7cc47c00edb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.185 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.186 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Ensure instance console log exists: /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.186 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.187 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.187 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:08 compute-1 nova_compute[189066]: 2025-12-05 09:39:08.303 189070 DEBUG nova.policy [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822a2d80e80e46d9a5a49e1e9560e0d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:39:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:08.885 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:08.886 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:08.886 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:10 compute-1 nova_compute[189066]: 2025-12-05 09:39:10.328 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:10 compute-1 podman[230432]: 2025-12-05 09:39:10.645015085 +0000 UTC m=+0.073494748 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:39:10 compute-1 nova_compute[189066]: 2025-12-05 09:39:10.725 189070 DEBUG nova.network.neutron [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Successfully updated port: e895fa16-2797-4f30-b0cc-644ea2908267 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:39:10.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:39:10 compute-1 nova_compute[189066]: 2025-12-05 09:39:10.855 189070 DEBUG nova.compute.manager [req-ec218dae-31ba-48a6-9101-a37b244d77be req-25bf8f83-d15b-4b12-bc93-60314544eabb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Received event network-changed-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:39:10 compute-1 nova_compute[189066]: 2025-12-05 09:39:10.855 189070 DEBUG nova.compute.manager [req-ec218dae-31ba-48a6-9101-a37b244d77be req-25bf8f83-d15b-4b12-bc93-60314544eabb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Refreshing instance network info cache due to event network-changed-e895fa16-2797-4f30-b0cc-644ea2908267. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:39:10 compute-1 nova_compute[189066]: 2025-12-05 09:39:10.855 189070 DEBUG oslo_concurrency.lockutils [req-ec218dae-31ba-48a6-9101-a37b244d77be req-25bf8f83-d15b-4b12-bc93-60314544eabb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-732f3a01-82af-4597-8b48-7cc47c00edb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:39:10 compute-1 nova_compute[189066]: 2025-12-05 09:39:10.856 189070 DEBUG oslo_concurrency.lockutils [req-ec218dae-31ba-48a6-9101-a37b244d77be req-25bf8f83-d15b-4b12-bc93-60314544eabb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-732f3a01-82af-4597-8b48-7cc47c00edb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:39:10 compute-1 nova_compute[189066]: 2025-12-05 09:39:10.856 189070 DEBUG nova.network.neutron [req-ec218dae-31ba-48a6-9101-a37b244d77be req-25bf8f83-d15b-4b12-bc93-60314544eabb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Refreshing network info cache for port e895fa16-2797-4f30-b0cc-644ea2908267 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:39:10 compute-1 nova_compute[189066]: 2025-12-05 09:39:10.866 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "refresh_cache-732f3a01-82af-4597-8b48-7cc47c00edb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:39:11 compute-1 nova_compute[189066]: 2025-12-05 09:39:11.040 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927536.0388362, 6d29aa46-325d-41f0-9726-7eb3e727aab4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:39:11 compute-1 nova_compute[189066]: 2025-12-05 09:39:11.040 189070 INFO nova.compute.manager [-] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] VM Stopped (Lifecycle Event)
Dec 05 09:39:11 compute-1 nova_compute[189066]: 2025-12-05 09:39:11.068 189070 DEBUG nova.compute.manager [None req-51ac9f8d-e1f0-46db-bd16-cc90f5bb77b3 - - - - - -] [instance: 6d29aa46-325d-41f0-9726-7eb3e727aab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:39:11 compute-1 nova_compute[189066]: 2025-12-05 09:39:11.073 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:11 compute-1 nova_compute[189066]: 2025-12-05 09:39:11.090 189070 DEBUG nova.network.neutron [req-ec218dae-31ba-48a6-9101-a37b244d77be req-25bf8f83-d15b-4b12-bc93-60314544eabb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:39:12 compute-1 nova_compute[189066]: 2025-12-05 09:39:12.233 189070 DEBUG nova.network.neutron [req-ec218dae-31ba-48a6-9101-a37b244d77be req-25bf8f83-d15b-4b12-bc93-60314544eabb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:39:12 compute-1 nova_compute[189066]: 2025-12-05 09:39:12.253 189070 DEBUG oslo_concurrency.lockutils [req-ec218dae-31ba-48a6-9101-a37b244d77be req-25bf8f83-d15b-4b12-bc93-60314544eabb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-732f3a01-82af-4597-8b48-7cc47c00edb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:39:12 compute-1 nova_compute[189066]: 2025-12-05 09:39:12.255 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquired lock "refresh_cache-732f3a01-82af-4597-8b48-7cc47c00edb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:39:12 compute-1 nova_compute[189066]: 2025-12-05 09:39:12.256 189070 DEBUG nova.network.neutron [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:39:12 compute-1 nova_compute[189066]: 2025-12-05 09:39:12.818 189070 DEBUG nova.network.neutron [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:39:13 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:13.072 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:a2:21 10.100.0.2 2001:db8:0:1:f816:3eff:fedf:a221 2001:db8::f816:3eff:fedf:a221'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fedf:a221/64 2001:db8::f816:3eff:fedf:a221/64', 'neutron:device_id': 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44e16512-8e6e-41d8-8a27-2e539ff56608, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=14946cf1-e45b-478f-9cf6-cbb706b4f055) old=Port_Binding(mac=['fa:16:3e:df:a2:21 10.100.0.2 2001:db8::f816:3eff:fedf:a221'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fedf:a221/64', 'neutron:device_id': 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:39:13 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:13.075 105272 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 14946cf1-e45b-478f-9cf6-cbb706b4f055 in datapath 73cf2b7d-b284-4d8e-896e-ab561df20f30 updated
Dec 05 09:39:13 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:13.077 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 73cf2b7d-b284-4d8e-896e-ab561df20f30, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:39:13 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:13.080 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[73ecea29-00e1-4db9-8a84-395e64e73034]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.254 189070 DEBUG nova.network.neutron [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Updating instance_info_cache with network_info: [{"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.290 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Releasing lock "refresh_cache-732f3a01-82af-4597-8b48-7cc47c00edb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.291 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Instance network_info: |[{"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.293 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Start _get_guest_xml network_info=[{"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.298 189070 WARNING nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.305 189070 DEBUG nova.virt.libvirt.host [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.306 189070 DEBUG nova.virt.libvirt.host [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.310 189070 DEBUG nova.virt.libvirt.host [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.310 189070 DEBUG nova.virt.libvirt.host [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.312 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.312 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.312 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.313 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.313 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.313 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.314 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.314 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.314 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.314 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.314 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.315 189070 DEBUG nova.virt.hardware [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.319 189070 DEBUG nova.virt.libvirt.vif [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:39:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-253118960',display_name='tempest-TestNetworkBasicOps-server-253118960',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-253118960',id=41,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE+hv27qgDRMBsfX5jXb8iLomInOpAxFJUfcjC+wXB9YVN42xsS1HtZGTRbX0bIYCe5S8Lfbh8PkI5HycuN5Bbq7EDH5z2L3ZwqGtpY36grPS6JgNrMhfQUlv20UszIIbA==',key_name='tempest-TestNetworkBasicOps-116725453',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-bi9014r1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:39:07Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=732f3a01-82af-4597-8b48-7cc47c00edb8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.319 189070 DEBUG nova.network.os_vif_util [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.320 189070 DEBUG nova.network.os_vif_util [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.321 189070 DEBUG nova.objects.instance [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 732f3a01-82af-4597-8b48-7cc47c00edb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.348 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <uuid>732f3a01-82af-4597-8b48-7cc47c00edb8</uuid>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <name>instance-00000029</name>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <nova:name>tempest-TestNetworkBasicOps-server-253118960</nova:name>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:39:14</nova:creationTime>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:39:14 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:39:14 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:39:14 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:39:14 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:39:14 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:39:14 compute-1 nova_compute[189066]:         <nova:user uuid="822a2d80e80e46d9a5a49e1e9560e0d9">tempest-TestNetworkBasicOps-1341993432-project-member</nova:user>
Dec 05 09:39:14 compute-1 nova_compute[189066]:         <nova:project uuid="f918144b49634ed5a43d75f8f7d194d3">tempest-TestNetworkBasicOps-1341993432</nova:project>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:39:14 compute-1 nova_compute[189066]:         <nova:port uuid="e895fa16-2797-4f30-b0cc-644ea2908267">
Dec 05 09:39:14 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <system>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <entry name="serial">732f3a01-82af-4597-8b48-7cc47c00edb8</entry>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <entry name="uuid">732f3a01-82af-4597-8b48-7cc47c00edb8</entry>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </system>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <os>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   </os>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <features>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   </features>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk.config"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:ae:d0:75"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <target dev="tape895fa16-27"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/console.log" append="off"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <video>
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </video>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:39:14 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:39:14 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:39:14 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:39:14 compute-1 nova_compute[189066]: </domain>
Dec 05 09:39:14 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.349 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Preparing to wait for external event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.349 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.350 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.350 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.351 189070 DEBUG nova.virt.libvirt.vif [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:39:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-253118960',display_name='tempest-TestNetworkBasicOps-server-253118960',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-253118960',id=41,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE+hv27qgDRMBsfX5jXb8iLomInOpAxFJUfcjC+wXB9YVN42xsS1HtZGTRbX0bIYCe5S8Lfbh8PkI5HycuN5Bbq7EDH5z2L3ZwqGtpY36grPS6JgNrMhfQUlv20UszIIbA==',key_name='tempest-TestNetworkBasicOps-116725453',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-bi9014r1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:39:07Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=732f3a01-82af-4597-8b48-7cc47c00edb8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.351 189070 DEBUG nova.network.os_vif_util [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.352 189070 DEBUG nova.network.os_vif_util [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.352 189070 DEBUG os_vif [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.353 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.353 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.353 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.357 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.357 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape895fa16-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.358 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape895fa16-27, col_values=(('external_ids', {'iface-id': 'e895fa16-2797-4f30-b0cc-644ea2908267', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:d0:75', 'vm-uuid': '732f3a01-82af-4597-8b48-7cc47c00edb8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.359 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:14 compute-1 NetworkManager[55704]: <info>  [1764927554.3607] manager: (tape895fa16-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.363 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.367 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.369 189070 INFO os_vif [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27')
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.474 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.475 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.475 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] No VIF found with MAC fa:16:3e:ae:d0:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:39:14 compute-1 nova_compute[189066]: 2025-12-05 09:39:14.476 189070 INFO nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Using config drive
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.119 189070 INFO nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Creating config drive at /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk.config
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.131 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpthmlf_ge execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.281 189070 DEBUG oslo_concurrency.processutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpthmlf_ge" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.331 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:15 compute-1 kernel: tape895fa16-27: entered promiscuous mode
Dec 05 09:39:15 compute-1 NetworkManager[55704]: <info>  [1764927555.3734] manager: (tape895fa16-27): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Dec 05 09:39:15 compute-1 ovn_controller[95809]: 2025-12-05T09:39:15Z|00224|binding|INFO|Claiming lport e895fa16-2797-4f30-b0cc-644ea2908267 for this chassis.
Dec 05 09:39:15 compute-1 ovn_controller[95809]: 2025-12-05T09:39:15Z|00225|binding|INFO|e895fa16-2797-4f30-b0cc-644ea2908267: Claiming fa:16:3e:ae:d0:75 10.100.0.3
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.374 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.377 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:15 compute-1 ovn_controller[95809]: 2025-12-05T09:39:15Z|00226|binding|INFO|Setting lport e895fa16-2797-4f30-b0cc-644ea2908267 ovn-installed in OVS
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.387 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:15 compute-1 ovn_controller[95809]: 2025-12-05T09:39:15Z|00227|binding|INFO|Setting lport e895fa16-2797-4f30-b0cc-644ea2908267 up in Southbound
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.392 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.392 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:d0:75 10.100.0.3'], port_security=['fa:16:3e:ae:d0:75 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-45866682', 'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '732f3a01-82af-4597-8b48-7cc47c00edb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-45866682', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'b999a1ea-2a47-43bb-b6d8-1df5b14a091a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4843ac73-160e-4d90-a31d-4942b2dafdeb, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=e895fa16-2797-4f30-b0cc-644ea2908267) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.394 105272 INFO neutron.agent.ovn.metadata.agent [-] Port e895fa16-2797-4f30-b0cc-644ea2908267 in datapath cc4506e6-cfb6-485f-9e7b-3a80985c5a7e bound to our chassis
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.397 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc4506e6-cfb6-485f-9e7b-3a80985c5a7e
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.417 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f28ab4bc-45f0-466f-b97f-9adccee9a194]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.419 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc4506e6-c1 in ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.421 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc4506e6-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.421 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8c5f674c-5883-4be2-94c1-676f7ed31cf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.422 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7729d8-68e2-41cf-a91c-90e8eddd1ca1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 systemd-udevd[230478]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:39:15 compute-1 systemd-machined[154815]: New machine qemu-18-instance-00000029.
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.439 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d052fb-012d-4114-b003-e4437b011bdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-00000029.
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.456 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[40b129e7-0a05-443d-a705-7904bf90aa7b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 NetworkManager[55704]: <info>  [1764927555.4608] device (tape895fa16-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:39:15 compute-1 NetworkManager[55704]: <info>  [1764927555.4624] device (tape895fa16-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.491 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[70806494-46ec-4cca-b80b-4336f20179e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 NetworkManager[55704]: <info>  [1764927555.4984] manager: (tapcc4506e6-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/117)
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.499 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d0960054-e7bd-4dd1-8246-d10f4f8f0472]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.540 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[93d5cdc2-5355-4148-b320-656792e14392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.545 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfb5504-4ff5-4d30-b7d6-93c65b2bd81a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 NetworkManager[55704]: <info>  [1764927555.5746] device (tapcc4506e6-c0): carrier: link connected
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.582 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[43c3a7c3-2519-4756-ab2a-0143ec83e75b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.603 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[25008e9a-00a5-49bb-9f71-87fe96992eea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc4506e6-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482857, 'reachable_time': 17715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230510, 'error': None, 'target': 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.623 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fc6d7703-89ef-43f2-ad0c-e7618b185e67]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:5d24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482857, 'tstamp': 482857}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230511, 'error': None, 'target': 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.645 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e46d7f4e-799f-4fae-97ac-5759d172932f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc4506e6-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482857, 'reachable_time': 17715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230512, 'error': None, 'target': 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.681 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[97ffded2-2bc5-4cba-a7ca-d220a28ba0d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.761 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[87afdba9-9559-47e0-9209-6fc68da9ad74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.763 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc4506e6-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.763 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.764 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc4506e6-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.765 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:15 compute-1 kernel: tapcc4506e6-c0: entered promiscuous mode
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.768 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:15 compute-1 NetworkManager[55704]: <info>  [1764927555.7694] manager: (tapcc4506e6-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.769 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc4506e6-c0, col_values=(('external_ids', {'iface-id': '58d8efc9-6765-49f2-921c-3850e3ec8c81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:15 compute-1 ovn_controller[95809]: 2025-12-05T09:39:15Z|00228|binding|INFO|Releasing lport 58d8efc9-6765-49f2-921c-3850e3ec8c81 from this chassis (sb_readonly=0)
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.771 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.772 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc4506e6-cfb6-485f-9e7b-3a80985c5a7e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc4506e6-cfb6-485f-9e7b-3a80985c5a7e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.773 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[869f903a-8239-48e7-8890-3b2954fcf306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.774 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/cc4506e6-cfb6-485f-9e7b-3a80985c5a7e.pid.haproxy
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID cc4506e6-cfb6-485f-9e7b-3a80985c5a7e
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:39:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:15.775 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'env', 'PROCESS_TAG=haproxy-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc4506e6-cfb6-485f-9e7b-3a80985c5a7e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:39:15 compute-1 nova_compute[189066]: 2025-12-05 09:39:15.783 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:16 compute-1 podman[230544]: 2025-12-05 09:39:16.165841779 +0000 UTC m=+0.048632231 container create 8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:39:16 compute-1 systemd[1]: Started libpod-conmon-8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840.scope.
Dec 05 09:39:16 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:39:16 compute-1 podman[230544]: 2025-12-05 09:39:16.138959952 +0000 UTC m=+0.021750424 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:39:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/641abc02b5b55c9bab922971104ff3a8ad19c87587da5c61cb4ef048ab85f04d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.245 189070 DEBUG nova.compute.manager [req-b67bf87d-58cf-4441-a427-307e1da90566 req-9e1b1707-6422-470e-8768-723e213aea54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Received event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.247 189070 DEBUG oslo_concurrency.lockutils [req-b67bf87d-58cf-4441-a427-307e1da90566 req-9e1b1707-6422-470e-8768-723e213aea54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.247 189070 DEBUG oslo_concurrency.lockutils [req-b67bf87d-58cf-4441-a427-307e1da90566 req-9e1b1707-6422-470e-8768-723e213aea54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.248 189070 DEBUG oslo_concurrency.lockutils [req-b67bf87d-58cf-4441-a427-307e1da90566 req-9e1b1707-6422-470e-8768-723e213aea54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.248 189070 DEBUG nova.compute.manager [req-b67bf87d-58cf-4441-a427-307e1da90566 req-9e1b1707-6422-470e-8768-723e213aea54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Processing event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:39:16 compute-1 podman[230544]: 2025-12-05 09:39:16.255739867 +0000 UTC m=+0.138530339 container init 8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:39:16 compute-1 podman[230544]: 2025-12-05 09:39:16.264512372 +0000 UTC m=+0.147302824 container start 8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:39:16 compute-1 podman[230557]: 2025-12-05 09:39:16.272573429 +0000 UTC m=+0.063306649 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:39:16 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230560]: [NOTICE]   (230585) : New worker (230589) forked
Dec 05 09:39:16 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230560]: [NOTICE]   (230585) : Loading success.
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.407 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.409 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927556.4066112, 732f3a01-82af-4597-8b48-7cc47c00edb8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.410 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] VM Started (Lifecycle Event)
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.413 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.418 189070 INFO nova.virt.libvirt.driver [-] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Instance spawned successfully.
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.419 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.737 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.741 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.776 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.777 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.777 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.778 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.778 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.779 189070 DEBUG nova.virt.libvirt.driver [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.844 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.845 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927556.4083958, 732f3a01-82af-4597-8b48-7cc47c00edb8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.845 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] VM Paused (Lifecycle Event)
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.897 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.902 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927556.4125953, 732f3a01-82af-4597-8b48-7cc47c00edb8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.903 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] VM Resumed (Lifecycle Event)
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.913 189070 INFO nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Took 9.10 seconds to spawn the instance on the hypervisor.
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.914 189070 DEBUG nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.925 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.929 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:39:16 compute-1 nova_compute[189066]: 2025-12-05 09:39:16.957 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:39:17 compute-1 nova_compute[189066]: 2025-12-05 09:39:17.018 189070 INFO nova.compute.manager [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Took 10.02 seconds to build instance.
Dec 05 09:39:17 compute-1 nova_compute[189066]: 2025-12-05 09:39:17.048 189070 DEBUG oslo_concurrency.lockutils [None req-ef5b06bb-9038-4b9c-967b-b34065188b6f 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:19 compute-1 nova_compute[189066]: 2025-12-05 09:39:19.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:19 compute-1 nova_compute[189066]: 2025-12-05 09:39:19.238 189070 DEBUG nova.compute.manager [req-77d4877e-4009-4535-b98d-760f9777d98e req-e82e32fa-f2d1-48cd-a1a0-56b4712e46a5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Received event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:39:19 compute-1 nova_compute[189066]: 2025-12-05 09:39:19.239 189070 DEBUG oslo_concurrency.lockutils [req-77d4877e-4009-4535-b98d-760f9777d98e req-e82e32fa-f2d1-48cd-a1a0-56b4712e46a5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:19 compute-1 nova_compute[189066]: 2025-12-05 09:39:19.239 189070 DEBUG oslo_concurrency.lockutils [req-77d4877e-4009-4535-b98d-760f9777d98e req-e82e32fa-f2d1-48cd-a1a0-56b4712e46a5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:19 compute-1 nova_compute[189066]: 2025-12-05 09:39:19.239 189070 DEBUG oslo_concurrency.lockutils [req-77d4877e-4009-4535-b98d-760f9777d98e req-e82e32fa-f2d1-48cd-a1a0-56b4712e46a5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:19 compute-1 nova_compute[189066]: 2025-12-05 09:39:19.240 189070 DEBUG nova.compute.manager [req-77d4877e-4009-4535-b98d-760f9777d98e req-e82e32fa-f2d1-48cd-a1a0-56b4712e46a5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] No waiting events found dispatching network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:39:19 compute-1 nova_compute[189066]: 2025-12-05 09:39:19.240 189070 WARNING nova.compute.manager [req-77d4877e-4009-4535-b98d-760f9777d98e req-e82e32fa-f2d1-48cd-a1a0-56b4712e46a5 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Received unexpected event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 for instance with vm_state active and task_state None.
Dec 05 09:39:19 compute-1 nova_compute[189066]: 2025-12-05 09:39:19.360 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.332 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.543 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "732f3a01-82af-4597-8b48-7cc47c00edb8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.544 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.545 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.545 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.545 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.547 189070 INFO nova.compute.manager [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Terminating instance
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.548 189070 DEBUG nova.compute.manager [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:39:20 compute-1 kernel: tape895fa16-27 (unregistering): left promiscuous mode
Dec 05 09:39:20 compute-1 NetworkManager[55704]: <info>  [1764927560.5692] device (tape895fa16-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:39:20 compute-1 ovn_controller[95809]: 2025-12-05T09:39:20Z|00229|binding|INFO|Releasing lport e895fa16-2797-4f30-b0cc-644ea2908267 from this chassis (sb_readonly=0)
Dec 05 09:39:20 compute-1 ovn_controller[95809]: 2025-12-05T09:39:20Z|00230|binding|INFO|Setting lport e895fa16-2797-4f30-b0cc-644ea2908267 down in Southbound
Dec 05 09:39:20 compute-1 ovn_controller[95809]: 2025-12-05T09:39:20Z|00231|binding|INFO|Removing iface tape895fa16-27 ovn-installed in OVS
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.576 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.594 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:d0:75 10.100.0.3'], port_security=['fa:16:3e:ae:d0:75 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-45866682', 'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '732f3a01-82af-4597-8b48-7cc47c00edb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-45866682', 'neutron:project_id': 'f918144b49634ed5a43d75f8f7d194d3', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'b999a1ea-2a47-43bb-b6d8-1df5b14a091a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.187', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4843ac73-160e-4d90-a31d-4942b2dafdeb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=e895fa16-2797-4f30-b0cc-644ea2908267) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.594 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.598 105272 INFO neutron.agent.ovn.metadata.agent [-] Port e895fa16-2797-4f30-b0cc-644ea2908267 in datapath cc4506e6-cfb6-485f-9e7b-3a80985c5a7e unbound from our chassis
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.600 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.601 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2de8fc00-7318-4ded-98b8-c1d276250b03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.602 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e namespace which is not needed anymore
Dec 05 09:39:20 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000029.scope: Deactivated successfully.
Dec 05 09:39:20 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000029.scope: Consumed 5.128s CPU time.
Dec 05 09:39:20 compute-1 systemd-machined[154815]: Machine qemu-18-instance-00000029 terminated.
Dec 05 09:39:20 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230560]: [NOTICE]   (230585) : haproxy version is 2.8.14-c23fe91
Dec 05 09:39:20 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230560]: [NOTICE]   (230585) : path to executable is /usr/sbin/haproxy
Dec 05 09:39:20 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230560]: [WARNING]  (230585) : Exiting Master process...
Dec 05 09:39:20 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230560]: [ALERT]    (230585) : Current worker (230589) exited with code 143 (Terminated)
Dec 05 09:39:20 compute-1 neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e[230560]: [WARNING]  (230585) : All workers exited. Exiting... (0)
Dec 05 09:39:20 compute-1 systemd[1]: libpod-8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840.scope: Deactivated successfully.
Dec 05 09:39:20 compute-1 podman[230626]: 2025-12-05 09:39:20.753655656 +0000 UTC m=+0.048570739 container died 8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.815 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840-userdata-shm.mount: Deactivated successfully.
Dec 05 09:39:20 compute-1 systemd[1]: var-lib-containers-storage-overlay-641abc02b5b55c9bab922971104ff3a8ad19c87587da5c61cb4ef048ab85f04d-merged.mount: Deactivated successfully.
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.827 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 podman[230626]: 2025-12-05 09:39:20.830866094 +0000 UTC m=+0.125781147 container cleanup 8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:39:20 compute-1 systemd[1]: libpod-conmon-8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840.scope: Deactivated successfully.
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.860 189070 INFO nova.virt.libvirt.driver [-] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Instance destroyed successfully.
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.861 189070 DEBUG nova.objects.instance [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lazy-loading 'resources' on Instance uuid 732f3a01-82af-4597-8b48-7cc47c00edb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.895 189070 DEBUG nova.virt.libvirt.vif [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:39:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-253118960',display_name='tempest-TestNetworkBasicOps-server-253118960',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-253118960',id=41,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE+hv27qgDRMBsfX5jXb8iLomInOpAxFJUfcjC+wXB9YVN42xsS1HtZGTRbX0bIYCe5S8Lfbh8PkI5HycuN5Bbq7EDH5z2L3ZwqGtpY36grPS6JgNrMhfQUlv20UszIIbA==',key_name='tempest-TestNetworkBasicOps-116725453',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:39:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f918144b49634ed5a43d75f8f7d194d3',ramdisk_id='',reservation_id='r-bi9014r1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1341993432',owner_user_name='tempest-TestNetworkBasicOps-1341993432-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:39:16Z,user_data=None,user_id='822a2d80e80e46d9a5a49e1e9560e0d9',uuid=732f3a01-82af-4597-8b48-7cc47c00edb8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.896 189070 DEBUG nova.network.os_vif_util [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converting VIF {"id": "e895fa16-2797-4f30-b0cc-644ea2908267", "address": "fa:16:3e:ae:d0:75", "network": {"id": "cc4506e6-cfb6-485f-9e7b-3a80985c5a7e", "bridge": "br-int", "label": "tempest-network-smoke--1626007660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f918144b49634ed5a43d75f8f7d194d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape895fa16-27", "ovs_interfaceid": "e895fa16-2797-4f30-b0cc-644ea2908267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.896 189070 DEBUG nova.network.os_vif_util [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.897 189070 DEBUG os_vif [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.899 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.899 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape895fa16-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.901 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.902 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.905 189070 INFO os_vif [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:d0:75,bridge_name='br-int',has_traffic_filtering=True,id=e895fa16-2797-4f30-b0cc-644ea2908267,network=Network(cc4506e6-cfb6-485f-9e7b-3a80985c5a7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape895fa16-27')
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.905 189070 INFO nova.virt.libvirt.driver [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Deleting instance files /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8_del
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.906 189070 INFO nova.virt.libvirt.driver [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Deletion of /var/lib/nova/instances/732f3a01-82af-4597-8b48-7cc47c00edb8_del complete
Dec 05 09:39:20 compute-1 podman[230665]: 2025-12-05 09:39:20.907276523 +0000 UTC m=+0.050344792 container remove 8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.915 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fa55ee2f-cc9f-4bfa-a78c-cdcfa66e76d1]: (4, ('Fri Dec  5 09:39:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e (8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840)\n8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840\nFri Dec  5 09:39:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e (8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840)\n8703e59c600fbc7f21b5302649996e73dc416d4fc8eec335955af1bbceccb840\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.917 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4177e0ee-195c-40b0-b8d8-3aaef55956a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.918 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc4506e6-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.920 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 kernel: tapcc4506e6-c0: left promiscuous mode
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.933 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.937 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0846e5-3bbb-4ed7-940b-1d98bd538719]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.952 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[92b804a2-c4bb-4eb5-94f2-71f35679f554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.954 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[17572873-410f-4b14-a7fa-201a93d9002b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.980 189070 INFO nova.compute.manager [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Took 0.43 seconds to destroy the instance on the hypervisor.
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.978 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f4fc88-15cc-4ce9-bc34-850982058d68]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482848, 'reachable_time': 35939, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230683, 'error': None, 'target': 'ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.981 189070 DEBUG oslo.service.loopingcall [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.981 189070 DEBUG nova.compute.manager [-] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:39:20 compute-1 nova_compute[189066]: 2025-12-05 09:39:20.982 189070 DEBUG nova.network.neutron [-] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.983 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc4506e6-cfb6-485f-9e7b-3a80985c5a7e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:39:20 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:20.983 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[c864b885-e9eb-4aef-89a0-e76d0ac70c13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:39:20 compute-1 systemd[1]: run-netns-ovnmeta\x2dcc4506e6\x2dcfb6\x2d485f\x2d9e7b\x2d3a80985c5a7e.mount: Deactivated successfully.
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.650 189070 DEBUG nova.compute.manager [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Received event network-vif-unplugged-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.650 189070 DEBUG oslo_concurrency.lockutils [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.651 189070 DEBUG oslo_concurrency.lockutils [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.651 189070 DEBUG oslo_concurrency.lockutils [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.651 189070 DEBUG nova.compute.manager [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] No waiting events found dispatching network-vif-unplugged-e895fa16-2797-4f30-b0cc-644ea2908267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.651 189070 DEBUG nova.compute.manager [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Received event network-vif-unplugged-e895fa16-2797-4f30-b0cc-644ea2908267 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.651 189070 DEBUG nova.compute.manager [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Received event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.652 189070 DEBUG oslo_concurrency.lockutils [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.652 189070 DEBUG oslo_concurrency.lockutils [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.652 189070 DEBUG oslo_concurrency.lockutils [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.652 189070 DEBUG nova.compute.manager [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] No waiting events found dispatching network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:39:21 compute-1 nova_compute[189066]: 2025-12-05 09:39:21.653 189070 WARNING nova.compute.manager [req-881ddd0b-39da-41d1-9572-d3ea4e61a6a3 req-f35736b1-5ede-4d6f-8481-4d081bd97ecb 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Received unexpected event network-vif-plugged-e895fa16-2797-4f30-b0cc-644ea2908267 for instance with vm_state active and task_state deleting.
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.052 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.053 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.053 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.054 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.150 189070 DEBUG nova.network.neutron [-] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:39:23 compute-1 podman[230685]: 2025-12-05 09:39:23.176555758 +0000 UTC m=+0.070687398 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.196 189070 INFO nova.compute.manager [-] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Took 2.21 seconds to deallocate network for instance.
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.257 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.259 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5718MB free_disk=73.32583618164062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.259 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.259 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.261 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.352 189070 WARNING nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance 732f3a01-82af-4597-8b48-7cc47c00edb8 is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.352 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.353 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.417 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.439 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.472 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.472 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.473 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.479 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.541 189070 INFO nova.scheduler.client.report [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Deleted allocations for instance 732f3a01-82af-4597-8b48-7cc47c00edb8
Dec 05 09:39:23 compute-1 nova_compute[189066]: 2025-12-05 09:39:23.614 189070 DEBUG oslo_concurrency.lockutils [None req-bde595fc-382a-4cb0-a0de-38f2656c93d5 822a2d80e80e46d9a5a49e1e9560e0d9 f918144b49634ed5a43d75f8f7d194d3 - - default default] Lock "732f3a01-82af-4597-8b48-7cc47c00edb8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:24 compute-1 nova_compute[189066]: 2025-12-05 09:39:24.474 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:24 compute-1 nova_compute[189066]: 2025-12-05 09:39:24.474 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:39:24 compute-1 nova_compute[189066]: 2025-12-05 09:39:24.474 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:39:24 compute-1 nova_compute[189066]: 2025-12-05 09:39:24.505 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:39:24 compute-1 nova_compute[189066]: 2025-12-05 09:39:24.506 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:24 compute-1 nova_compute[189066]: 2025-12-05 09:39:24.506 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:39:25 compute-1 nova_compute[189066]: 2025-12-05 09:39:25.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:25 compute-1 nova_compute[189066]: 2025-12-05 09:39:25.379 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:25 compute-1 nova_compute[189066]: 2025-12-05 09:39:25.902 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:26 compute-1 nova_compute[189066]: 2025-12-05 09:39:26.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:39:28 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:28.680 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:39:28 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:28.682 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:39:28 compute-1 nova_compute[189066]: 2025-12-05 09:39:28.681 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:28 compute-1 podman[230706]: 2025-12-05 09:39:28.687653564 +0000 UTC m=+0.124164515 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:39:30 compute-1 nova_compute[189066]: 2025-12-05 09:39:30.381 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:30 compute-1 nova_compute[189066]: 2025-12-05 09:39:30.904 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:31 compute-1 podman[230733]: 2025-12-05 09:39:31.631500947 +0000 UTC m=+0.066523268 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 05 09:39:33 compute-1 podman[230754]: 2025-12-05 09:39:33.614592374 +0000 UTC m=+0.058633364 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:39:34 compute-1 nova_compute[189066]: 2025-12-05 09:39:34.156 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:34 compute-1 nova_compute[189066]: 2025-12-05 09:39:34.304 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:35 compute-1 nova_compute[189066]: 2025-12-05 09:39:35.383 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:35 compute-1 nova_compute[189066]: 2025-12-05 09:39:35.859 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927560.8570986, 732f3a01-82af-4597-8b48-7cc47c00edb8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:39:35 compute-1 nova_compute[189066]: 2025-12-05 09:39:35.860 189070 INFO nova.compute.manager [-] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] VM Stopped (Lifecycle Event)
Dec 05 09:39:35 compute-1 nova_compute[189066]: 2025-12-05 09:39:35.890 189070 DEBUG nova.compute.manager [None req-279fdaa5-2e89-446f-bb36-bfb9817242fa - - - - - -] [instance: 732f3a01-82af-4597-8b48-7cc47c00edb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:39:35 compute-1 nova_compute[189066]: 2025-12-05 09:39:35.908 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:37 compute-1 podman[230777]: 2025-12-05 09:39:37.622963761 +0000 UTC m=+0.065559715 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64)
Dec 05 09:39:37 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:39:37.685 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:39:40 compute-1 nova_compute[189066]: 2025-12-05 09:39:40.400 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:40 compute-1 nova_compute[189066]: 2025-12-05 09:39:40.910 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:41 compute-1 podman[230801]: 2025-12-05 09:39:41.628825936 +0000 UTC m=+0.063691898 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:39:45 compute-1 nova_compute[189066]: 2025-12-05 09:39:45.403 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:45 compute-1 nova_compute[189066]: 2025-12-05 09:39:45.912 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:46 compute-1 podman[230824]: 2025-12-05 09:39:46.626172428 +0000 UTC m=+0.068264970 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:39:50 compute-1 nova_compute[189066]: 2025-12-05 09:39:50.405 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:50 compute-1 nova_compute[189066]: 2025-12-05 09:39:50.915 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:53 compute-1 podman[230848]: 2025-12-05 09:39:53.629147729 +0000 UTC m=+0.069702896 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Dec 05 09:39:55 compute-1 nova_compute[189066]: 2025-12-05 09:39:55.407 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:55 compute-1 nova_compute[189066]: 2025-12-05 09:39:55.918 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:39:58 compute-1 sshd-session[230868]: Received disconnect from 101.47.162.91 port 45386:11: Bye Bye [preauth]
Dec 05 09:39:58 compute-1 sshd-session[230868]: Disconnected from authenticating user root 101.47.162.91 port 45386 [preauth]
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.396 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.396 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.423 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.507 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.508 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.516 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.517 189070 INFO nova.compute.claims [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.798 189070 DEBUG nova.compute.provider_tree [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.816 189070 DEBUG nova.scheduler.client.report [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.853 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.854 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.942 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.943 189070 DEBUG nova.network.neutron [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:39:58 compute-1 nova_compute[189066]: 2025-12-05 09:39:58.983 189070 INFO nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.014 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.144 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.150 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.152 189070 INFO nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Creating image(s)
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.153 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "/var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.154 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "/var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.155 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "/var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.187 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.255 189070 DEBUG nova.policy [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.262 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.264 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.265 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.288 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.350 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.352 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.395 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.397 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.398 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.460 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.462 189070 DEBUG nova.virt.disk.api [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Checking if we can resize image /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.462 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.567 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.569 189070 DEBUG nova.virt.disk.api [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Cannot resize image /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.570 189070 DEBUG nova.objects.instance [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'migration_context' on Instance uuid cddb30f3-076c-4cc6-8609-80dce3c0c67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:39:59 compute-1 podman[230883]: 2025-12-05 09:39:59.664754433 +0000 UTC m=+0.097000274 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.993 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.993 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Ensure instance console log exists: /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.994 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.994 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:39:59 compute-1 nova_compute[189066]: 2025-12-05 09:39:59.995 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:00 compute-1 nova_compute[189066]: 2025-12-05 09:40:00.411 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:00 compute-1 nova_compute[189066]: 2025-12-05 09:40:00.921 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:01 compute-1 nova_compute[189066]: 2025-12-05 09:40:01.489 189070 DEBUG nova.network.neutron [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Successfully created port: ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:40:02 compute-1 podman[230911]: 2025-12-05 09:40:02.621239714 +0000 UTC m=+0.059427984 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 05 09:40:03 compute-1 nova_compute[189066]: 2025-12-05 09:40:03.094 189070 DEBUG nova.network.neutron [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Successfully updated port: ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:40:03 compute-1 nova_compute[189066]: 2025-12-05 09:40:03.115 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:40:03 compute-1 nova_compute[189066]: 2025-12-05 09:40:03.115 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquired lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:40:03 compute-1 nova_compute[189066]: 2025-12-05 09:40:03.116 189070 DEBUG nova.network.neutron [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:40:03 compute-1 nova_compute[189066]: 2025-12-05 09:40:03.993 189070 DEBUG nova.network.neutron [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:40:04 compute-1 podman[230930]: 2025-12-05 09:40:04.629326112 +0000 UTC m=+0.068266230 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 05 09:40:04 compute-1 nova_compute[189066]: 2025-12-05 09:40:04.634 189070 DEBUG nova.compute.manager [req-b9641579-f5f6-48bb-b768-38e0f5d11585 req-50a12fd9-8b05-4831-8e24-9cbc67dc02f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Received event network-changed-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:40:04 compute-1 nova_compute[189066]: 2025-12-05 09:40:04.635 189070 DEBUG nova.compute.manager [req-b9641579-f5f6-48bb-b768-38e0f5d11585 req-50a12fd9-8b05-4831-8e24-9cbc67dc02f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Refreshing instance network info cache due to event network-changed-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:40:04 compute-1 nova_compute[189066]: 2025-12-05 09:40:04.635 189070 DEBUG oslo_concurrency.lockutils [req-b9641579-f5f6-48bb-b768-38e0f5d11585 req-50a12fd9-8b05-4831-8e24-9cbc67dc02f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:40:05 compute-1 nova_compute[189066]: 2025-12-05 09:40:05.413 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:05 compute-1 nova_compute[189066]: 2025-12-05 09:40:05.924 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.495 189070 DEBUG nova.network.neutron [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updating instance_info_cache with network_info: [{"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.519 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Releasing lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.519 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Instance network_info: |[{"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.520 189070 DEBUG oslo_concurrency.lockutils [req-b9641579-f5f6-48bb-b768-38e0f5d11585 req-50a12fd9-8b05-4831-8e24-9cbc67dc02f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.520 189070 DEBUG nova.network.neutron [req-b9641579-f5f6-48bb-b768-38e0f5d11585 req-50a12fd9-8b05-4831-8e24-9cbc67dc02f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Refreshing network info cache for port ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.524 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Start _get_guest_xml network_info=[{"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.530 189070 WARNING nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.536 189070 DEBUG nova.virt.libvirt.host [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.536 189070 DEBUG nova.virt.libvirt.host [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.543 189070 DEBUG nova.virt.libvirt.host [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.544 189070 DEBUG nova.virt.libvirt.host [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.545 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.546 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.546 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.546 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.547 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.547 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.547 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.547 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.548 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.548 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.548 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.548 189070 DEBUG nova.virt.hardware [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.553 189070 DEBUG nova.virt.libvirt.vif [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:39:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1388404493',display_name='tempest-TestGettingAddress-server-1388404493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1388404493',id=44,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBafU0ViNZ53/PwZlyW/c3PBfwXWKR4oTu3AQxTtSLWjy2Zcdb0NG0DqRxEmeoeFfvnQSXxIpzjyhk7NEskxhM71gR3QXFE46g9tGofm55gDMCuHR08Qdd5xuE5myonkkw==',key_name='tempest-TestGettingAddress-1977496242',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-17at0qfh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:39:59Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=cddb30f3-076c-4cc6-8609-80dce3c0c67c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.554 189070 DEBUG nova.network.os_vif_util [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.556 189070 DEBUG nova.network.os_vif_util [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:8b,bridge_name='br-int',has_traffic_filtering=True,id=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5,network=Network(73cf2b7d-b284-4d8e-896e-ab561df20f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe39c38-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.557 189070 DEBUG nova.objects.instance [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'pci_devices' on Instance uuid cddb30f3-076c-4cc6-8609-80dce3c0c67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.578 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <uuid>cddb30f3-076c-4cc6-8609-80dce3c0c67c</uuid>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <name>instance-0000002c</name>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <nova:name>tempest-TestGettingAddress-server-1388404493</nova:name>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:40:07</nova:creationTime>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:40:07 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:40:07 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:40:07 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:40:07 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:40:07 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:40:07 compute-1 nova_compute[189066]:         <nova:user uuid="fae1c60e378945ea84b34c4824b835b1">tempest-TestGettingAddress-8368731-project-member</nova:user>
Dec 05 09:40:07 compute-1 nova_compute[189066]:         <nova:project uuid="fa1cd463d74b49139a088d332d37e611">tempest-TestGettingAddress-8368731</nova:project>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:40:07 compute-1 nova_compute[189066]:         <nova:port uuid="ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5">
Dec 05 09:40:07 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe33:cd8b" ipVersion="6"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe33:cd8b" ipVersion="6"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <system>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <entry name="serial">cddb30f3-076c-4cc6-8609-80dce3c0c67c</entry>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <entry name="uuid">cddb30f3-076c-4cc6-8609-80dce3c0c67c</entry>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </system>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <os>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   </os>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <features>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   </features>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk.config"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:33:cd:8b"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <target dev="tapebe39c38-2d"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/console.log" append="off"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <video>
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </video>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:40:07 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:40:07 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:40:07 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:40:07 compute-1 nova_compute[189066]: </domain>
Dec 05 09:40:07 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.580 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Preparing to wait for external event network-vif-plugged-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.580 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.580 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.581 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.581 189070 DEBUG nova.virt.libvirt.vif [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:39:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1388404493',display_name='tempest-TestGettingAddress-server-1388404493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1388404493',id=44,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBafU0ViNZ53/PwZlyW/c3PBfwXWKR4oTu3AQxTtSLWjy2Zcdb0NG0DqRxEmeoeFfvnQSXxIpzjyhk7NEskxhM71gR3QXFE46g9tGofm55gDMCuHR08Qdd5xuE5myonkkw==',key_name='tempest-TestGettingAddress-1977496242',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-17at0qfh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:39:59Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=cddb30f3-076c-4cc6-8609-80dce3c0c67c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.582 189070 DEBUG nova.network.os_vif_util [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.582 189070 DEBUG nova.network.os_vif_util [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:8b,bridge_name='br-int',has_traffic_filtering=True,id=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5,network=Network(73cf2b7d-b284-4d8e-896e-ab561df20f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe39c38-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.583 189070 DEBUG os_vif [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:8b,bridge_name='br-int',has_traffic_filtering=True,id=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5,network=Network(73cf2b7d-b284-4d8e-896e-ab561df20f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe39c38-2d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.584 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.584 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.585 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.589 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.589 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebe39c38-2d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.590 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebe39c38-2d, col_values=(('external_ids', {'iface-id': 'ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:cd:8b', 'vm-uuid': 'cddb30f3-076c-4cc6-8609-80dce3c0c67c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.591 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:07 compute-1 NetworkManager[55704]: <info>  [1764927607.5925] manager: (tapebe39c38-2d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.593 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.599 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.600 189070 INFO os_vif [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:8b,bridge_name='br-int',has_traffic_filtering=True,id=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5,network=Network(73cf2b7d-b284-4d8e-896e-ab561df20f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe39c38-2d')
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.668 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.668 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.668 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No VIF found with MAC fa:16:3e:33:cd:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:40:07 compute-1 nova_compute[189066]: 2025-12-05 09:40:07.669 189070 INFO nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Using config drive
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.196 189070 INFO nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Creating config drive at /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk.config
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.203 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpib5brolk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.338 189070 DEBUG oslo_concurrency.processutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpib5brolk" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:40:08 compute-1 kernel: tapebe39c38-2d: entered promiscuous mode
Dec 05 09:40:08 compute-1 NetworkManager[55704]: <info>  [1764927608.4249] manager: (tapebe39c38-2d): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Dec 05 09:40:08 compute-1 ovn_controller[95809]: 2025-12-05T09:40:08Z|00232|binding|INFO|Claiming lport ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 for this chassis.
Dec 05 09:40:08 compute-1 ovn_controller[95809]: 2025-12-05T09:40:08Z|00233|binding|INFO|ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5: Claiming fa:16:3e:33:cd:8b 10.100.0.9 2001:db8:0:1:f816:3eff:fe33:cd8b 2001:db8::f816:3eff:fe33:cd8b
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.427 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.432 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.437 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.438 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.442 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 NetworkManager[55704]: <info>  [1764927608.4434] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Dec 05 09:40:08 compute-1 NetworkManager[55704]: <info>  [1764927608.4441] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.448 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:cd:8b 10.100.0.9 2001:db8:0:1:f816:3eff:fe33:cd8b 2001:db8::f816:3eff:fe33:cd8b'], port_security=['fa:16:3e:33:cd:8b 10.100.0.9 2001:db8:0:1:f816:3eff:fe33:cd8b 2001:db8::f816:3eff:fe33:cd8b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fe33:cd8b/64 2001:db8::f816:3eff:fe33:cd8b/64', 'neutron:device_id': 'cddb30f3-076c-4cc6-8609-80dce3c0c67c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '73f8d656-c691-4354-aaa0-9599172cea40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44e16512-8e6e-41d8-8a27-2e539ff56608, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.449 105272 INFO neutron.agent.ovn.metadata.agent [-] Port ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 in datapath 73cf2b7d-b284-4d8e-896e-ab561df20f30 bound to our chassis
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.451 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 73cf2b7d-b284-4d8e-896e-ab561df20f30
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.467 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d956c562-6d60-49d7-994c-2927fd2077d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.469 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap73cf2b7d-b1 in ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.472 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap73cf2b7d-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.473 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[720ad927-1b8e-48cf-8d4f-8541f86a19d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 systemd-machined[154815]: New machine qemu-19-instance-0000002c.
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.474 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[46334ed2-5bfe-4f16-99b3-440d05995f19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 systemd-udevd[230991]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.486 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1360c5-7a4b-499f-9396-d9b112a59bcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 NetworkManager[55704]: <info>  [1764927608.4968] device (tapebe39c38-2d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:40:08 compute-1 NetworkManager[55704]: <info>  [1764927608.4977] device (tapebe39c38-2d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:40:08 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-0000002c.
Dec 05 09:40:08 compute-1 podman[230963]: 2025-12-05 09:40:08.50325532 +0000 UTC m=+0.081429592 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.525 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[677af8a1-f293-4978-be8a-9c65d166c427]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.550 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.562 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[645a6c63-b95b-42ea-9a6c-9fa8f228cd90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 NetworkManager[55704]: <info>  [1764927608.5869] manager: (tap73cf2b7d-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.588 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.590 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[021d0db6-a2d5-4452-a000-fda286bbdbb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_controller[95809]: 2025-12-05T09:40:08Z|00234|binding|INFO|Setting lport ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 ovn-installed in OVS
Dec 05 09:40:08 compute-1 ovn_controller[95809]: 2025-12-05T09:40:08Z|00235|binding|INFO|Setting lport ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 up in Southbound
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.609 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.621 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.633 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[21d62b6b-03c5-4244-812e-5ba62ecb9ae5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.636 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[378af6e8-38eb-4378-a5a8-a6d89cf1d21e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 NetworkManager[55704]: <info>  [1764927608.6631] device (tap73cf2b7d-b0): carrier: link connected
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.669 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[810add8f-3091-4194-9df2-e65e29bc5db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.690 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3d5db6-0e3e-4a84-b34e-b2d050e8557b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap73cf2b7d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:a2:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488166, 'reachable_time': 16560, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231025, 'error': None, 'target': 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.709 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ab263489-916b-4c08-9e38-c5ae0652034e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:a221'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488166, 'tstamp': 488166}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231026, 'error': None, 'target': 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.735 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0896c12e-af47-401c-aa3b-b43621ec04a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap73cf2b7d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:a2:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488166, 'reachable_time': 16560, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231027, 'error': None, 'target': 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.778 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ec470326-b0d9-45c4-830a-a910b5cc50f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.847 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7d18c253-9e49-40ce-bc46-e9aa0cf33dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.849 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73cf2b7d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.850 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.850 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73cf2b7d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.852 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 kernel: tap73cf2b7d-b0: entered promiscuous mode
Dec 05 09:40:08 compute-1 NetworkManager[55704]: <info>  [1764927608.8534] manager: (tap73cf2b7d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.855 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.856 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap73cf2b7d-b0, col_values=(('external_ids', {'iface-id': '14946cf1-e45b-478f-9cf6-cbb706b4f055'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.858 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 ovn_controller[95809]: 2025-12-05T09:40:08Z|00236|binding|INFO|Releasing lport 14946cf1-e45b-478f-9cf6-cbb706b4f055 from this chassis (sb_readonly=0)
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.859 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/73cf2b7d-b284-4d8e-896e-ab561df20f30.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/73cf2b7d-b284-4d8e-896e-ab561df20f30.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.860 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8519c1fd-47af-42fb-b3ef-9f9311deca40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.864 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-73cf2b7d-b284-4d8e-896e-ab561df20f30
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/73cf2b7d-b284-4d8e-896e-ab561df20f30.pid.haproxy
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 73cf2b7d-b284-4d8e-896e-ab561df20f30
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.865 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'env', 'PROCESS_TAG=haproxy-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/73cf2b7d-b284-4d8e-896e-ab561df20f30.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.871 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.886 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.887 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:08.888 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.967 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927608.9663393, cddb30f3-076c-4cc6-8609-80dce3c0c67c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:40:08 compute-1 nova_compute[189066]: 2025-12-05 09:40:08.970 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] VM Started (Lifecycle Event)
Dec 05 09:40:09 compute-1 nova_compute[189066]: 2025-12-05 09:40:09.019 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:40:09 compute-1 nova_compute[189066]: 2025-12-05 09:40:09.025 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927608.9676461, cddb30f3-076c-4cc6-8609-80dce3c0c67c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:40:09 compute-1 nova_compute[189066]: 2025-12-05 09:40:09.025 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] VM Paused (Lifecycle Event)
Dec 05 09:40:09 compute-1 nova_compute[189066]: 2025-12-05 09:40:09.060 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:40:09 compute-1 nova_compute[189066]: 2025-12-05 09:40:09.065 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:40:09 compute-1 nova_compute[189066]: 2025-12-05 09:40:09.105 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:40:09 compute-1 podman[231066]: 2025-12-05 09:40:09.329165698 +0000 UTC m=+0.058543882 container create 9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:40:09 compute-1 systemd[1]: Started libpod-conmon-9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1.scope.
Dec 05 09:40:09 compute-1 podman[231066]: 2025-12-05 09:40:09.300388095 +0000 UTC m=+0.029766279 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:40:09 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:40:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e032a2f8701d73f387b9fbffaf55bd9ef90952ddb92c8df9b69de3ed352175fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:40:09 compute-1 podman[231066]: 2025-12-05 09:40:09.428653131 +0000 UTC m=+0.158031335 container init 9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 09:40:09 compute-1 podman[231066]: 2025-12-05 09:40:09.435353985 +0000 UTC m=+0.164732159 container start 9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:40:09 compute-1 neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30[231081]: [NOTICE]   (231085) : New worker (231087) forked
Dec 05 09:40:09 compute-1 neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30[231081]: [NOTICE]   (231085) : Loading success.
Dec 05 09:40:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:09.506 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.415 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.570 189070 DEBUG nova.network.neutron [req-b9641579-f5f6-48bb-b768-38e0f5d11585 req-50a12fd9-8b05-4831-8e24-9cbc67dc02f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updated VIF entry in instance network info cache for port ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.571 189070 DEBUG nova.network.neutron [req-b9641579-f5f6-48bb-b768-38e0f5d11585 req-50a12fd9-8b05-4831-8e24-9cbc67dc02f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updating instance_info_cache with network_info: [{"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.590 189070 DEBUG oslo_concurrency.lockutils [req-b9641579-f5f6-48bb-b768-38e0f5d11585 req-50a12fd9-8b05-4831-8e24-9cbc67dc02f1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.757 189070 DEBUG nova.compute.manager [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Received event network-vif-plugged-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.758 189070 DEBUG oslo_concurrency.lockutils [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.758 189070 DEBUG oslo_concurrency.lockutils [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.758 189070 DEBUG oslo_concurrency.lockutils [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.759 189070 DEBUG nova.compute.manager [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Processing event network-vif-plugged-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.759 189070 DEBUG nova.compute.manager [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Received event network-vif-plugged-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.759 189070 DEBUG oslo_concurrency.lockutils [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.759 189070 DEBUG oslo_concurrency.lockutils [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.759 189070 DEBUG oslo_concurrency.lockutils [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.760 189070 DEBUG nova.compute.manager [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] No waiting events found dispatching network-vif-plugged-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.760 189070 WARNING nova.compute.manager [req-aed1f5b0-95e6-4dfb-8664-67805d0c0adf req-51d6d915-f214-41ad-9195-ffc8c5320026 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Received unexpected event network-vif-plugged-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 for instance with vm_state building and task_state spawning.
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.760 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.765 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927610.765647, cddb30f3-076c-4cc6-8609-80dce3c0c67c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.766 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] VM Resumed (Lifecycle Event)
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.768 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.771 189070 INFO nova.virt.libvirt.driver [-] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Instance spawned successfully.
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.771 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.801 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.806 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.807 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.807 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.807 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.808 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.808 189070 DEBUG nova.virt.libvirt.driver [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.812 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.856 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.888 189070 INFO nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Took 11.74 seconds to spawn the instance on the hypervisor.
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.888 189070 DEBUG nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.958 189070 INFO nova.compute.manager [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Took 12.49 seconds to build instance.
Dec 05 09:40:10 compute-1 nova_compute[189066]: 2025-12-05 09:40:10.981 189070 DEBUG oslo_concurrency.lockutils [None req-f32b50c8-ec3c-4248-b172-43bed64b359a fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:12 compute-1 nova_compute[189066]: 2025-12-05 09:40:12.592 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:12 compute-1 podman[231096]: 2025-12-05 09:40:12.631235732 +0000 UTC m=+0.063728210 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:40:14 compute-1 nova_compute[189066]: 2025-12-05 09:40:14.233 189070 DEBUG nova.compute.manager [req-874a282e-9a5b-4553-9323-5933ac5523ec req-53c31457-73fc-47b2-b7e4-5d57b0ba6e94 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Received event network-changed-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:40:14 compute-1 nova_compute[189066]: 2025-12-05 09:40:14.235 189070 DEBUG nova.compute.manager [req-874a282e-9a5b-4553-9323-5933ac5523ec req-53c31457-73fc-47b2-b7e4-5d57b0ba6e94 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Refreshing instance network info cache due to event network-changed-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:40:14 compute-1 nova_compute[189066]: 2025-12-05 09:40:14.235 189070 DEBUG oslo_concurrency.lockutils [req-874a282e-9a5b-4553-9323-5933ac5523ec req-53c31457-73fc-47b2-b7e4-5d57b0ba6e94 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:40:14 compute-1 nova_compute[189066]: 2025-12-05 09:40:14.235 189070 DEBUG oslo_concurrency.lockutils [req-874a282e-9a5b-4553-9323-5933ac5523ec req-53c31457-73fc-47b2-b7e4-5d57b0ba6e94 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:40:14 compute-1 nova_compute[189066]: 2025-12-05 09:40:14.236 189070 DEBUG nova.network.neutron [req-874a282e-9a5b-4553-9323-5933ac5523ec req-53c31457-73fc-47b2-b7e4-5d57b0ba6e94 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Refreshing network info cache for port ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:40:15 compute-1 nova_compute[189066]: 2025-12-05 09:40:15.479 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:15 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:15.508 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:16 compute-1 nova_compute[189066]: 2025-12-05 09:40:16.984 189070 DEBUG nova.network.neutron [req-874a282e-9a5b-4553-9323-5933ac5523ec req-53c31457-73fc-47b2-b7e4-5d57b0ba6e94 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updated VIF entry in instance network info cache for port ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:40:16 compute-1 nova_compute[189066]: 2025-12-05 09:40:16.985 189070 DEBUG nova.network.neutron [req-874a282e-9a5b-4553-9323-5933ac5523ec req-53c31457-73fc-47b2-b7e4-5d57b0ba6e94 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updating instance_info_cache with network_info: [{"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:40:17 compute-1 nova_compute[189066]: 2025-12-05 09:40:17.018 189070 DEBUG oslo_concurrency.lockutils [req-874a282e-9a5b-4553-9323-5933ac5523ec req-53c31457-73fc-47b2-b7e4-5d57b0ba6e94 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:40:17 compute-1 nova_compute[189066]: 2025-12-05 09:40:17.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:17 compute-1 nova_compute[189066]: 2025-12-05 09:40:17.594 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:17 compute-1 podman[231121]: 2025-12-05 09:40:17.623390687 +0000 UTC m=+0.060826348 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:40:19 compute-1 nova_compute[189066]: 2025-12-05 09:40:19.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:20 compute-1 nova_compute[189066]: 2025-12-05 09:40:20.014 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:20 compute-1 nova_compute[189066]: 2025-12-05 09:40:20.481 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:21 compute-1 nova_compute[189066]: 2025-12-05 09:40:21.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:22 compute-1 nova_compute[189066]: 2025-12-05 09:40:22.597 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.051 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.052 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.052 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.053 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:40:24 compute-1 podman[231156]: 2025-12-05 09:40:24.184232425 +0000 UTC m=+0.071602612 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.200 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.269 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.271 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.346 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.518 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.521 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5576MB free_disk=73.29602813720703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.521 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.522 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.648 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance cddb30f3-076c-4cc6-8609-80dce3c0c67c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.649 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.649 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.702 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.719 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.744 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:40:24 compute-1 nova_compute[189066]: 2025-12-05 09:40:24.744 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:25 compute-1 ovn_controller[95809]: 2025-12-05T09:40:25Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:cd:8b 10.100.0.9
Dec 05 09:40:25 compute-1 ovn_controller[95809]: 2025-12-05T09:40:25Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:cd:8b 10.100.0.9
Dec 05 09:40:25 compute-1 nova_compute[189066]: 2025-12-05 09:40:25.483 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:25 compute-1 nova_compute[189066]: 2025-12-05 09:40:25.745 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:25 compute-1 nova_compute[189066]: 2025-12-05 09:40:25.746 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:40:25 compute-1 nova_compute[189066]: 2025-12-05 09:40:25.746 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:40:26 compute-1 nova_compute[189066]: 2025-12-05 09:40:26.125 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:40:26 compute-1 nova_compute[189066]: 2025-12-05 09:40:26.125 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:40:26 compute-1 nova_compute[189066]: 2025-12-05 09:40:26.125 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:40:26 compute-1 nova_compute[189066]: 2025-12-05 09:40:26.126 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cddb30f3-076c-4cc6-8609-80dce3c0c67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:40:27 compute-1 nova_compute[189066]: 2025-12-05 09:40:27.657 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:28 compute-1 nova_compute[189066]: 2025-12-05 09:40:28.980 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updating instance_info_cache with network_info: [{"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:40:29 compute-1 nova_compute[189066]: 2025-12-05 09:40:29.012 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:40:29 compute-1 nova_compute[189066]: 2025-12-05 09:40:29.013 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:40:29 compute-1 nova_compute[189066]: 2025-12-05 09:40:29.013 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:29 compute-1 nova_compute[189066]: 2025-12-05 09:40:29.013 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:29 compute-1 nova_compute[189066]: 2025-12-05 09:40:29.014 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:29 compute-1 nova_compute[189066]: 2025-12-05 09:40:29.014 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:40:29 compute-1 nova_compute[189066]: 2025-12-05 09:40:29.014 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:40:30 compute-1 nova_compute[189066]: 2025-12-05 09:40:30.486 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:30 compute-1 podman[231186]: 2025-12-05 09:40:30.665698001 +0000 UTC m=+0.096428119 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:40:32 compute-1 nova_compute[189066]: 2025-12-05 09:40:32.660 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:33 compute-1 podman[231213]: 2025-12-05 09:40:33.62493518 +0000 UTC m=+0.060354227 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 05 09:40:35 compute-1 ovn_controller[95809]: 2025-12-05T09:40:35Z|00237|binding|INFO|Releasing lport 14946cf1-e45b-478f-9cf6-cbb706b4f055 from this chassis (sb_readonly=0)
Dec 05 09:40:35 compute-1 nova_compute[189066]: 2025-12-05 09:40:35.235 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:35 compute-1 nova_compute[189066]: 2025-12-05 09:40:35.489 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:35 compute-1 podman[231232]: 2025-12-05 09:40:35.625410384 +0000 UTC m=+0.063506914 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:40:37 compute-1 nova_compute[189066]: 2025-12-05 09:40:37.663 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:38 compute-1 podman[231254]: 2025-12-05 09:40:38.620717247 +0000 UTC m=+0.059153499 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 05 09:40:40 compute-1 nova_compute[189066]: 2025-12-05 09:40:40.491 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:42 compute-1 nova_compute[189066]: 2025-12-05 09:40:42.666 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:43 compute-1 podman[231275]: 2025-12-05 09:40:43.61811186 +0000 UTC m=+0.063007072 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:40:44 compute-1 nova_compute[189066]: 2025-12-05 09:40:44.330 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:45 compute-1 nova_compute[189066]: 2025-12-05 09:40:45.494 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:47 compute-1 nova_compute[189066]: 2025-12-05 09:40:47.668 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:48 compute-1 podman[231299]: 2025-12-05 09:40:48.628432859 +0000 UTC m=+0.056458712 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.771 189070 DEBUG nova.compute.manager [req-0d76977b-ba55-4fd4-8ff3-fff0b8381541 req-d7669045-3278-4826-befb-cb9fc461346b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Received event network-changed-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.771 189070 DEBUG nova.compute.manager [req-0d76977b-ba55-4fd4-8ff3-fff0b8381541 req-d7669045-3278-4826-befb-cb9fc461346b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Refreshing instance network info cache due to event network-changed-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.772 189070 DEBUG oslo_concurrency.lockutils [req-0d76977b-ba55-4fd4-8ff3-fff0b8381541 req-d7669045-3278-4826-befb-cb9fc461346b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.772 189070 DEBUG oslo_concurrency.lockutils [req-0d76977b-ba55-4fd4-8ff3-fff0b8381541 req-d7669045-3278-4826-befb-cb9fc461346b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.772 189070 DEBUG nova.network.neutron [req-0d76977b-ba55-4fd4-8ff3-fff0b8381541 req-d7669045-3278-4826-befb-cb9fc461346b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Refreshing network info cache for port ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.925 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.926 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.926 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.926 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.927 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.928 189070 INFO nova.compute.manager [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Terminating instance
Dec 05 09:40:48 compute-1 nova_compute[189066]: 2025-12-05 09:40:48.929 189070 DEBUG nova.compute.manager [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:40:48 compute-1 kernel: tapebe39c38-2d (unregistering): left promiscuous mode
Dec 05 09:40:49 compute-1 NetworkManager[55704]: <info>  [1764927649.0022] device (tapebe39c38-2d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.014 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:49 compute-1 ovn_controller[95809]: 2025-12-05T09:40:49Z|00238|binding|INFO|Releasing lport ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 from this chassis (sb_readonly=0)
Dec 05 09:40:49 compute-1 ovn_controller[95809]: 2025-12-05T09:40:49Z|00239|binding|INFO|Setting lport ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 down in Southbound
Dec 05 09:40:49 compute-1 ovn_controller[95809]: 2025-12-05T09:40:49Z|00240|binding|INFO|Removing iface tapebe39c38-2d ovn-installed in OVS
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.017 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.025 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:cd:8b 10.100.0.9 2001:db8:0:1:f816:3eff:fe33:cd8b 2001:db8::f816:3eff:fe33:cd8b'], port_security=['fa:16:3e:33:cd:8b 10.100.0.9 2001:db8:0:1:f816:3eff:fe33:cd8b 2001:db8::f816:3eff:fe33:cd8b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fe33:cd8b/64 2001:db8::f816:3eff:fe33:cd8b/64', 'neutron:device_id': 'cddb30f3-076c-4cc6-8609-80dce3c0c67c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '4', 'neutron:security_group_ids': '73f8d656-c691-4354-aaa0-9599172cea40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44e16512-8e6e-41d8-8a27-2e539ff56608, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.027 105272 INFO neutron.agent.ovn.metadata.agent [-] Port ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 in datapath 73cf2b7d-b284-4d8e-896e-ab561df20f30 unbound from our chassis
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.030 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 73cf2b7d-b284-4d8e-896e-ab561df20f30, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.030 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.033 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d728ee6d-c926-41ac-b934-cef3eb93ef1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.034 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30 namespace which is not needed anymore
Dec 05 09:40:49 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Dec 05 09:40:49 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002c.scope: Consumed 15.466s CPU time.
Dec 05 09:40:49 compute-1 systemd-machined[154815]: Machine qemu-19-instance-0000002c terminated.
Dec 05 09:40:49 compute-1 neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30[231081]: [NOTICE]   (231085) : haproxy version is 2.8.14-c23fe91
Dec 05 09:40:49 compute-1 neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30[231081]: [NOTICE]   (231085) : path to executable is /usr/sbin/haproxy
Dec 05 09:40:49 compute-1 neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30[231081]: [WARNING]  (231085) : Exiting Master process...
Dec 05 09:40:49 compute-1 neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30[231081]: [ALERT]    (231085) : Current worker (231087) exited with code 143 (Terminated)
Dec 05 09:40:49 compute-1 neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30[231081]: [WARNING]  (231085) : All workers exited. Exiting... (0)
Dec 05 09:40:49 compute-1 systemd[1]: libpod-9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1.scope: Deactivated successfully.
Dec 05 09:40:49 compute-1 conmon[231081]: conmon 9add60fcafeb2ddb93db <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1.scope/container/memory.events
Dec 05 09:40:49 compute-1 podman[231348]: 2025-12-05 09:40:49.193660452 +0000 UTC m=+0.056814891 container died 9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 09:40:49 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1-userdata-shm.mount: Deactivated successfully.
Dec 05 09:40:49 compute-1 systemd[1]: var-lib-containers-storage-overlay-e032a2f8701d73f387b9fbffaf55bd9ef90952ddb92c8df9b69de3ed352175fa-merged.mount: Deactivated successfully.
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.231 189070 INFO nova.virt.libvirt.driver [-] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Instance destroyed successfully.
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.233 189070 DEBUG nova.objects.instance [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'resources' on Instance uuid cddb30f3-076c-4cc6-8609-80dce3c0c67c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:40:49 compute-1 podman[231348]: 2025-12-05 09:40:49.234961251 +0000 UTC m=+0.098115680 container cleanup 9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 09:40:49 compute-1 systemd[1]: libpod-conmon-9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1.scope: Deactivated successfully.
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.260 189070 DEBUG nova.virt.libvirt.vif [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:39:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1388404493',display_name='tempest-TestGettingAddress-server-1388404493',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1388404493',id=44,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBafU0ViNZ53/PwZlyW/c3PBfwXWKR4oTu3AQxTtSLWjy2Zcdb0NG0DqRxEmeoeFfvnQSXxIpzjyhk7NEskxhM71gR3QXFE46g9tGofm55gDMCuHR08Qdd5xuE5myonkkw==',key_name='tempest-TestGettingAddress-1977496242',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:40:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-17at0qfh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:40:10Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=cddb30f3-076c-4cc6-8609-80dce3c0c67c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.261 189070 DEBUG nova.network.os_vif_util [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.262 189070 DEBUG nova.network.os_vif_util [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:cd:8b,bridge_name='br-int',has_traffic_filtering=True,id=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5,network=Network(73cf2b7d-b284-4d8e-896e-ab561df20f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe39c38-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.263 189070 DEBUG os_vif [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:cd:8b,bridge_name='br-int',has_traffic_filtering=True,id=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5,network=Network(73cf2b7d-b284-4d8e-896e-ab561df20f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe39c38-2d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.265 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.265 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebe39c38-2d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.267 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.269 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.274 189070 INFO os_vif [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:cd:8b,bridge_name='br-int',has_traffic_filtering=True,id=ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5,network=Network(73cf2b7d-b284-4d8e-896e-ab561df20f30),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe39c38-2d')
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.275 189070 INFO nova.virt.libvirt.driver [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Deleting instance files /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c_del
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.276 189070 INFO nova.virt.libvirt.driver [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Deletion of /var/lib/nova/instances/cddb30f3-076c-4cc6-8609-80dce3c0c67c_del complete
Dec 05 09:40:49 compute-1 podman[231393]: 2025-12-05 09:40:49.300601207 +0000 UTC m=+0.042126601 container remove 9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.306 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e975ee7b-5928-4766-bb16-8eb5d9a27eca]: (4, ('Fri Dec  5 09:40:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30 (9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1)\n9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1\nFri Dec  5 09:40:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30 (9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1)\n9add60fcafeb2ddb93db68a78eb79c47c80091ba97733a058fcc175dfc52e4c1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.308 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb4f3c0-8f3f-4763-a7d3-a9e849a9b9c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.309 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73cf2b7d-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.311 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:49 compute-1 kernel: tap73cf2b7d-b0: left promiscuous mode
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.322 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.326 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[da0d459a-1198-4ebe-9521-bd9c82996de5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.345 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[136fd4eb-e137-461e-9716-259b8758f998]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.346 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c63e9a-8a2d-497f-b1b8-c42c422133b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.365 189070 INFO nova.compute.manager [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Took 0.44 seconds to destroy the instance on the hypervisor.
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.366 189070 DEBUG oslo.service.loopingcall [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.366 189070 DEBUG nova.compute.manager [-] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:40:49 compute-1 nova_compute[189066]: 2025-12-05 09:40:49.367 189070 DEBUG nova.network.neutron [-] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.368 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf0e548-a8ba-4924-9886-ab079c8fa434]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488156, 'reachable_time': 21229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231409, 'error': None, 'target': 'ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:49 compute-1 systemd[1]: run-netns-ovnmeta\x2d73cf2b7d\x2db284\x2d4d8e\x2d896e\x2dab561df20f30.mount: Deactivated successfully.
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.375 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-73cf2b7d-b284-4d8e-896e-ab561df20f30 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:40:49 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:40:49.377 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[b376e73b-c379-4d3b-afb3-5aec8b106b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:40:50 compute-1 nova_compute[189066]: 2025-12-05 09:40:50.498 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:54 compute-1 nova_compute[189066]: 2025-12-05 09:40:54.304 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:54 compute-1 nova_compute[189066]: 2025-12-05 09:40:54.316 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:54 compute-1 podman[231410]: 2025-12-05 09:40:54.627736714 +0000 UTC m=+0.064847706 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 09:40:54 compute-1 nova_compute[189066]: 2025-12-05 09:40:54.888 189070 DEBUG nova.network.neutron [-] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:40:54 compute-1 nova_compute[189066]: 2025-12-05 09:40:54.920 189070 INFO nova.compute.manager [-] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Took 5.55 seconds to deallocate network for instance.
Dec 05 09:40:54 compute-1 nova_compute[189066]: 2025-12-05 09:40:54.986 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:40:54 compute-1 nova_compute[189066]: 2025-12-05 09:40:54.987 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:40:55 compute-1 nova_compute[189066]: 2025-12-05 09:40:55.029 189070 DEBUG nova.compute.manager [req-43952ec1-6627-49d4-bb14-af31928b3911 req-889562a0-9281-4397-bc0c-ea410dbdb1e1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Received event network-vif-deleted-ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:40:55 compute-1 nova_compute[189066]: 2025-12-05 09:40:55.062 189070 DEBUG nova.compute.provider_tree [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:40:55 compute-1 nova_compute[189066]: 2025-12-05 09:40:55.082 189070 DEBUG nova.scheduler.client.report [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:40:55 compute-1 nova_compute[189066]: 2025-12-05 09:40:55.110 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:55 compute-1 nova_compute[189066]: 2025-12-05 09:40:55.178 189070 INFO nova.scheduler.client.report [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Deleted allocations for instance cddb30f3-076c-4cc6-8609-80dce3c0c67c
Dec 05 09:40:55 compute-1 nova_compute[189066]: 2025-12-05 09:40:55.283 189070 DEBUG oslo_concurrency.lockutils [None req-edd2efdd-e99c-4d41-814b-97052e985a48 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "cddb30f3-076c-4cc6-8609-80dce3c0c67c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:40:55 compute-1 nova_compute[189066]: 2025-12-05 09:40:55.500 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:40:56 compute-1 nova_compute[189066]: 2025-12-05 09:40:56.808 189070 DEBUG nova.network.neutron [req-0d76977b-ba55-4fd4-8ff3-fff0b8381541 req-d7669045-3278-4826-befb-cb9fc461346b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updated VIF entry in instance network info cache for port ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:40:56 compute-1 nova_compute[189066]: 2025-12-05 09:40:56.809 189070 DEBUG nova.network.neutron [req-0d76977b-ba55-4fd4-8ff3-fff0b8381541 req-d7669045-3278-4826-befb-cb9fc461346b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Updating instance_info_cache with network_info: [{"id": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "address": "fa:16:3e:33:cd:8b", "network": {"id": "73cf2b7d-b284-4d8e-896e-ab561df20f30", "bridge": "br-int", "label": "tempest-network-smoke--1597388656", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:cd8b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe39c38-2d", "ovs_interfaceid": "ebe39c38-2dc3-42b4-a602-d1b9cc0a0de5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:40:56 compute-1 nova_compute[189066]: 2025-12-05 09:40:56.833 189070 DEBUG oslo_concurrency.lockutils [req-0d76977b-ba55-4fd4-8ff3-fff0b8381541 req-d7669045-3278-4826-befb-cb9fc461346b 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-cddb30f3-076c-4cc6-8609-80dce3c0c67c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:40:59 compute-1 nova_compute[189066]: 2025-12-05 09:40:59.309 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:00 compute-1 nova_compute[189066]: 2025-12-05 09:41:00.502 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:01 compute-1 podman[231430]: 2025-12-05 09:41:01.658072474 +0000 UTC m=+0.086018504 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0)
Dec 05 09:41:04 compute-1 nova_compute[189066]: 2025-12-05 09:41:04.231 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927649.22951, cddb30f3-076c-4cc6-8609-80dce3c0c67c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:41:04 compute-1 nova_compute[189066]: 2025-12-05 09:41:04.231 189070 INFO nova.compute.manager [-] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] VM Stopped (Lifecycle Event)
Dec 05 09:41:04 compute-1 nova_compute[189066]: 2025-12-05 09:41:04.285 189070 DEBUG nova.compute.manager [None req-a145c7c9-83a9-4569-b789-1e7292a258e9 - - - - - -] [instance: cddb30f3-076c-4cc6-8609-80dce3c0c67c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:41:04 compute-1 nova_compute[189066]: 2025-12-05 09:41:04.313 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:04 compute-1 podman[231456]: 2025-12-05 09:41:04.61021054 +0000 UTC m=+0.054649817 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 05 09:41:05 compute-1 nova_compute[189066]: 2025-12-05 09:41:05.504 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:06 compute-1 podman[231475]: 2025-12-05 09:41:06.60856349 +0000 UTC m=+0.051521810 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:41:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:08.887 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:41:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:08.888 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:41:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:08.888 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:41:09 compute-1 nova_compute[189066]: 2025-12-05 09:41:09.346 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:09 compute-1 podman[231494]: 2025-12-05 09:41:09.622488026 +0000 UTC m=+0.062300874 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container)
Dec 05 09:41:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:09.659 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:41:09 compute-1 nova_compute[189066]: 2025-12-05 09:41:09.659 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:09.660 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:41:09 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:09.661 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:41:10 compute-1 nova_compute[189066]: 2025-12-05 09:41:10.506 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:41:10.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:41:12 compute-1 nova_compute[189066]: 2025-12-05 09:41:12.259 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:12 compute-1 nova_compute[189066]: 2025-12-05 09:41:12.403 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:14 compute-1 nova_compute[189066]: 2025-12-05 09:41:14.349 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:14 compute-1 podman[231517]: 2025-12-05 09:41:14.618789803 +0000 UTC m=+0.056679108 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:41:15 compute-1 nova_compute[189066]: 2025-12-05 09:41:15.507 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:18 compute-1 nova_compute[189066]: 2025-12-05 09:41:18.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:19 compute-1 nova_compute[189066]: 2025-12-05 09:41:19.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:19 compute-1 nova_compute[189066]: 2025-12-05 09:41:19.392 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:19 compute-1 podman[231541]: 2025-12-05 09:41:19.616804283 +0000 UTC m=+0.057121898 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:41:20 compute-1 nova_compute[189066]: 2025-12-05 09:41:20.510 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:21 compute-1 nova_compute[189066]: 2025-12-05 09:41:21.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.396 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.645 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.646 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.646 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.647 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:41:24 compute-1 podman[231566]: 2025-12-05 09:41:24.779780815 +0000 UTC m=+0.079183448 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.913 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.916 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5736MB free_disk=73.32389068603516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.916 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:41:24 compute-1 nova_compute[189066]: 2025-12-05 09:41:24.916 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:41:25 compute-1 nova_compute[189066]: 2025-12-05 09:41:25.267 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:41:25 compute-1 nova_compute[189066]: 2025-12-05 09:41:25.268 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:41:25 compute-1 nova_compute[189066]: 2025-12-05 09:41:25.294 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:41:25 compute-1 nova_compute[189066]: 2025-12-05 09:41:25.513 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:25 compute-1 nova_compute[189066]: 2025-12-05 09:41:25.516 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:41:25 compute-1 nova_compute[189066]: 2025-12-05 09:41:25.609 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:41:25 compute-1 nova_compute[189066]: 2025-12-05 09:41:25.610 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.610 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.611 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.611 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.834 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.835 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.835 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.835 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.836 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:41:27 compute-1 nova_compute[189066]: 2025-12-05 09:41:27.836 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:41:29 compute-1 nova_compute[189066]: 2025-12-05 09:41:29.400 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:30 compute-1 nova_compute[189066]: 2025-12-05 09:41:30.514 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:32 compute-1 podman[231588]: 2025-12-05 09:41:32.683019442 +0000 UTC m=+0.124052335 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:41:34 compute-1 nova_compute[189066]: 2025-12-05 09:41:34.403 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:35 compute-1 nova_compute[189066]: 2025-12-05 09:41:35.517 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:35 compute-1 podman[231615]: 2025-12-05 09:41:35.623871611 +0000 UTC m=+0.058601254 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 05 09:41:37 compute-1 podman[231636]: 2025-12-05 09:41:37.630767811 +0000 UTC m=+0.067543173 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 05 09:41:39 compute-1 nova_compute[189066]: 2025-12-05 09:41:39.408 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:40 compute-1 nova_compute[189066]: 2025-12-05 09:41:40.556 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:40 compute-1 podman[231656]: 2025-12-05 09:41:40.654106078 +0000 UTC m=+0.066729383 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Dec 05 09:41:44 compute-1 nova_compute[189066]: 2025-12-05 09:41:44.412 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:45 compute-1 nova_compute[189066]: 2025-12-05 09:41:45.557 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:45 compute-1 podman[231679]: 2025-12-05 09:41:45.631563614 +0000 UTC m=+0.075186180 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:41:46 compute-1 sshd-session[231677]: Connection reset by authenticating user root 45.135.232.92 port 31620 [preauth]
Dec 05 09:41:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:47.367 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:63:79 10.100.0.2 2001:db8::f816:3eff:fee4:6379'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee4:6379/64', 'neutron:device_id': 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e534066c-24e2-4a55-800c-4e1e4900b516, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dbd588a7-89a1-4a96-82f6-7a2d0d05566b) old=Port_Binding(mac=['fa:16:3e:e4:63:79 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:41:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:47.369 105272 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dbd588a7-89a1-4a96-82f6-7a2d0d05566b in datapath c8127193-8f39-4d91-b81d-cba716b6b70a updated
Dec 05 09:41:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:47.371 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8127193-8f39-4d91-b81d-cba716b6b70a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:41:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:41:47.373 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ef60918b-c33c-445e-a08e-41185ffa2a07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:41:48 compute-1 sshd-session[231704]: Connection reset by authenticating user root 45.135.232.92 port 31624 [preauth]
Dec 05 09:41:49 compute-1 nova_compute[189066]: 2025-12-05 09:41:49.416 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:50 compute-1 sshd-session[231706]: Invalid user support from 45.135.232.92 port 31632
Dec 05 09:41:50 compute-1 podman[231708]: 2025-12-05 09:41:50.227597661 +0000 UTC m=+0.058429410 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:41:50 compute-1 nova_compute[189066]: 2025-12-05 09:41:50.559 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:51 compute-1 sshd-session[231706]: Connection reset by invalid user support 45.135.232.92 port 31632 [preauth]
Dec 05 09:41:53 compute-1 sshd-session[231732]: Invalid user admin from 45.135.232.92 port 31642
Dec 05 09:41:53 compute-1 sshd-session[231732]: Connection reset by invalid user admin 45.135.232.92 port 31642 [preauth]
Dec 05 09:41:54 compute-1 nova_compute[189066]: 2025-12-05 09:41:54.420 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:55 compute-1 nova_compute[189066]: 2025-12-05 09:41:55.560 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:41:55 compute-1 podman[231736]: 2025-12-05 09:41:55.626281988 +0000 UTC m=+0.067130383 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:41:56 compute-1 sshd-session[231734]: Connection reset by authenticating user root 45.135.232.92 port 31656 [preauth]
Dec 05 09:41:59 compute-1 nova_compute[189066]: 2025-12-05 09:41:59.423 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:00 compute-1 nova_compute[189066]: 2025-12-05 09:42:00.564 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:03 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:03.533 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:63:79 10.100.0.2 2001:db8:0:1:f816:3eff:fee4:6379 2001:db8::f816:3eff:fee4:6379'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fee4:6379/64 2001:db8::f816:3eff:fee4:6379/64', 'neutron:device_id': 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e534066c-24e2-4a55-800c-4e1e4900b516, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dbd588a7-89a1-4a96-82f6-7a2d0d05566b) old=Port_Binding(mac=['fa:16:3e:e4:63:79 10.100.0.2 2001:db8::f816:3eff:fee4:6379'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee4:6379/64', 'neutron:device_id': 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:42:03 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:03.535 105272 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dbd588a7-89a1-4a96-82f6-7a2d0d05566b in datapath c8127193-8f39-4d91-b81d-cba716b6b70a updated
Dec 05 09:42:03 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:03.537 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8127193-8f39-4d91-b81d-cba716b6b70a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:42:03 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:03.538 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[dbbd9dbf-ede1-4ce8-b596-197bc3983f1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:03 compute-1 podman[231756]: 2025-12-05 09:42:03.661624065 +0000 UTC m=+0.097314011 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:42:04 compute-1 nova_compute[189066]: 2025-12-05 09:42:04.425 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:05 compute-1 nova_compute[189066]: 2025-12-05 09:42:05.566 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:06 compute-1 podman[231782]: 2025-12-05 09:42:06.624746809 +0000 UTC m=+0.056312168 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 05 09:42:08 compute-1 ovn_controller[95809]: 2025-12-05T09:42:08Z|00241|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 05 09:42:08 compute-1 podman[231801]: 2025-12-05 09:42:08.619694976 +0000 UTC m=+0.064907708 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Dec 05 09:42:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:08.889 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:08.889 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:08.890 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:09 compute-1 nova_compute[189066]: 2025-12-05 09:42:09.455 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:10 compute-1 nova_compute[189066]: 2025-12-05 09:42:10.568 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:11 compute-1 podman[231823]: 2025-12-05 09:42:11.636577818 +0000 UTC m=+0.075572968 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350)
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.106 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.107 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.139 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.262 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.263 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.273 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.274 189070 INFO nova.compute.claims [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.415 189070 DEBUG nova.compute.provider_tree [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.436 189070 DEBUG nova.scheduler.client.report [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.508 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.509 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.775 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:42:12 compute-1 nova_compute[189066]: 2025-12-05 09:42:12.776 189070 DEBUG nova.network.neutron [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.426 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.462 189070 INFO nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.505 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.636 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.638 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.638 189070 INFO nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Creating image(s)
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.639 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "/var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.640 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "/var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.640 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "/var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.657 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.729 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.731 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.732 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.747 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.811 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.812 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.927 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk 1073741824" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.929 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.929 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.986 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.988 189070 DEBUG nova.virt.disk.api [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Checking if we can resize image /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:42:13 compute-1 nova_compute[189066]: 2025-12-05 09:42:13.988 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.049 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.050 189070 DEBUG nova.virt.disk.api [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Cannot resize image /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.051 189070 DEBUG nova.objects.instance [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'migration_context' on Instance uuid b417b484-e28d-4f07-a7a6-9528c9b27a63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.069 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.070 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Ensure instance console log exists: /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.071 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.071 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.072 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.376 189070 DEBUG nova.policy [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:42:14 compute-1 nova_compute[189066]: 2025-12-05 09:42:14.458 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:15 compute-1 nova_compute[189066]: 2025-12-05 09:42:15.570 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:16 compute-1 podman[231861]: 2025-12-05 09:42:16.636850771 +0000 UTC m=+0.070651007 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:42:19 compute-1 nova_compute[189066]: 2025-12-05 09:42:19.018 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:19 compute-1 nova_compute[189066]: 2025-12-05 09:42:19.019 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:19 compute-1 nova_compute[189066]: 2025-12-05 09:42:19.497 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:20 compute-1 nova_compute[189066]: 2025-12-05 09:42:20.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:20 compute-1 nova_compute[189066]: 2025-12-05 09:42:20.609 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:20 compute-1 podman[231888]: 2025-12-05 09:42:20.632033023 +0000 UTC m=+0.069811468 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:42:22 compute-1 nova_compute[189066]: 2025-12-05 09:42:22.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:23 compute-1 nova_compute[189066]: 2025-12-05 09:42:23.234 189070 DEBUG nova.network.neutron [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Successfully created port: ce2e0b21-6b51-4972-a784-60843f6686f3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:42:24 compute-1 nova_compute[189066]: 2025-12-05 09:42:24.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:24 compute-1 nova_compute[189066]: 2025-12-05 09:42:24.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 09:42:24 compute-1 nova_compute[189066]: 2025-12-05 09:42:24.235 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 09:42:24 compute-1 nova_compute[189066]: 2025-12-05 09:42:24.501 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:24 compute-1 sshd-session[231887]: Connection closed by 101.47.162.91 port 58578 [preauth]
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.236 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.316 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.317 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.317 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.317 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.509 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.511 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5740MB free_disk=73.32368087768555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.511 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.512 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.612 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.912 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance b417b484-e28d-4f07-a7a6-9528c9b27a63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.913 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.913 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:42:25 compute-1 nova_compute[189066]: 2025-12-05 09:42:25.931 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing inventories for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.007 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating ProviderTree inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.008 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.025 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing aggregate associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.060 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Refreshing trait associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.133 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.337 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.397 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.397 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:26 compute-1 podman[231916]: 2025-12-05 09:42:26.649141055 +0000 UTC m=+0.088207121 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:42:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:26.799 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:42:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:26.801 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:42:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:26.802 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:42:26 compute-1 nova_compute[189066]: 2025-12-05 09:42:26.803 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.183 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.184 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.184 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.426 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.427 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.428 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.428 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.428 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.429 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.612 189070 DEBUG nova.network.neutron [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Successfully updated port: ce2e0b21-6b51-4972-a784-60843f6686f3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.853 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.853 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquired lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:42:27 compute-1 nova_compute[189066]: 2025-12-05 09:42:27.853 189070 DEBUG nova.network.neutron [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:42:28 compute-1 nova_compute[189066]: 2025-12-05 09:42:28.050 189070 DEBUG nova.network.neutron [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:42:28 compute-1 nova_compute[189066]: 2025-12-05 09:42:28.620 189070 DEBUG nova.compute.manager [req-d9b499f9-fbf0-4d85-84f2-0153fd477e8c req-041b4b3a-f537-4c68-a139-2178bf062852 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-changed-ce2e0b21-6b51-4972-a784-60843f6686f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:42:28 compute-1 nova_compute[189066]: 2025-12-05 09:42:28.621 189070 DEBUG nova.compute.manager [req-d9b499f9-fbf0-4d85-84f2-0153fd477e8c req-041b4b3a-f537-4c68-a139-2178bf062852 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Refreshing instance network info cache due to event network-changed-ce2e0b21-6b51-4972-a784-60843f6686f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:42:28 compute-1 nova_compute[189066]: 2025-12-05 09:42:28.621 189070 DEBUG oslo_concurrency.lockutils [req-d9b499f9-fbf0-4d85-84f2-0153fd477e8c req-041b4b3a-f537-4c68-a139-2178bf062852 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:42:29 compute-1 nova_compute[189066]: 2025-12-05 09:42:29.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:29 compute-1 nova_compute[189066]: 2025-12-05 09:42:29.504 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.668 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.841 189070 DEBUG nova.network.neutron [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updating instance_info_cache with network_info: [{"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.867 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Releasing lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.868 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Instance network_info: |[{"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.869 189070 DEBUG oslo_concurrency.lockutils [req-d9b499f9-fbf0-4d85-84f2-0153fd477e8c req-041b4b3a-f537-4c68-a139-2178bf062852 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.870 189070 DEBUG nova.network.neutron [req-d9b499f9-fbf0-4d85-84f2-0153fd477e8c req-041b4b3a-f537-4c68-a139-2178bf062852 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Refreshing network info cache for port ce2e0b21-6b51-4972-a784-60843f6686f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.874 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Start _get_guest_xml network_info=[{"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.883 189070 WARNING nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.896 189070 DEBUG nova.virt.libvirt.host [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.897 189070 DEBUG nova.virt.libvirt.host [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.906 189070 DEBUG nova.virt.libvirt.host [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.907 189070 DEBUG nova.virt.libvirt.host [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.909 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.910 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.910 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.910 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.911 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.911 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.911 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.911 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.912 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.912 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.912 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.912 189070 DEBUG nova.virt.hardware [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.917 189070 DEBUG nova.virt.libvirt.vif [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:42:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1007468349',display_name='tempest-TestGettingAddress-server-1007468349',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1007468349',id=47,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF7b/7MwbHNgolWlFgvs9a+gQGW2Zj7PA97WQ9maV2HyX6ljndlAHVDtzh0a6QTU+HpOhqDK5bEN+TBhnr5fx5xapjRV4/ouFtwlIZGp978P/7BBZIVuAmc+cOHOO30M/A==',key_name='tempest-TestGettingAddress-1631840748',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-p615iv1e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:42:13Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=b417b484-e28d-4f07-a7a6-9528c9b27a63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.918 189070 DEBUG nova.network.os_vif_util [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.919 189070 DEBUG nova.network.os_vif_util [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:c2:8e,bridge_name='br-int',has_traffic_filtering=True,id=ce2e0b21-6b51-4972-a784-60843f6686f3,network=Network(c8127193-8f39-4d91-b81d-cba716b6b70a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2e0b21-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.921 189070 DEBUG nova.objects.instance [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'pci_devices' on Instance uuid b417b484-e28d-4f07-a7a6-9528c9b27a63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.940 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <uuid>b417b484-e28d-4f07-a7a6-9528c9b27a63</uuid>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <name>instance-0000002f</name>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <nova:name>tempest-TestGettingAddress-server-1007468349</nova:name>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:42:30</nova:creationTime>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:42:30 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:42:30 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:42:30 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:42:30 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:42:30 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:42:30 compute-1 nova_compute[189066]:         <nova:user uuid="fae1c60e378945ea84b34c4824b835b1">tempest-TestGettingAddress-8368731-project-member</nova:user>
Dec 05 09:42:30 compute-1 nova_compute[189066]:         <nova:project uuid="fa1cd463d74b49139a088d332d37e611">tempest-TestGettingAddress-8368731</nova:project>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:42:30 compute-1 nova_compute[189066]:         <nova:port uuid="ce2e0b21-6b51-4972-a784-60843f6686f3">
Dec 05 09:42:30 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe70:c28e" ipVersion="6"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe70:c28e" ipVersion="6"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <system>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <entry name="serial">b417b484-e28d-4f07-a7a6-9528c9b27a63</entry>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <entry name="uuid">b417b484-e28d-4f07-a7a6-9528c9b27a63</entry>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </system>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <os>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   </os>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <features>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   </features>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.config"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:70:c2:8e"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <target dev="tapce2e0b21-6b"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/console.log" append="off"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <video>
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </video>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:42:30 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:42:30 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:42:30 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:42:30 compute-1 nova_compute[189066]: </domain>
Dec 05 09:42:30 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.942 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Preparing to wait for external event network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.942 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.942 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.943 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.944 189070 DEBUG nova.virt.libvirt.vif [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:42:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1007468349',display_name='tempest-TestGettingAddress-server-1007468349',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1007468349',id=47,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF7b/7MwbHNgolWlFgvs9a+gQGW2Zj7PA97WQ9maV2HyX6ljndlAHVDtzh0a6QTU+HpOhqDK5bEN+TBhnr5fx5xapjRV4/ouFtwlIZGp978P/7BBZIVuAmc+cOHOO30M/A==',key_name='tempest-TestGettingAddress-1631840748',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-p615iv1e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:42:13Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=b417b484-e28d-4f07-a7a6-9528c9b27a63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.944 189070 DEBUG nova.network.os_vif_util [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.945 189070 DEBUG nova.network.os_vif_util [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:c2:8e,bridge_name='br-int',has_traffic_filtering=True,id=ce2e0b21-6b51-4972-a784-60843f6686f3,network=Network(c8127193-8f39-4d91-b81d-cba716b6b70a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2e0b21-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.945 189070 DEBUG os_vif [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:c2:8e,bridge_name='br-int',has_traffic_filtering=True,id=ce2e0b21-6b51-4972-a784-60843f6686f3,network=Network(c8127193-8f39-4d91-b81d-cba716b6b70a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2e0b21-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.946 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.947 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.947 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.960 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.961 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce2e0b21-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.961 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapce2e0b21-6b, col_values=(('external_ids', {'iface-id': 'ce2e0b21-6b51-4972-a784-60843f6686f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:70:c2:8e', 'vm-uuid': 'b417b484-e28d-4f07-a7a6-9528c9b27a63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.963 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:30 compute-1 NetworkManager[55704]: <info>  [1764927750.9652] manager: (tapce2e0b21-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.967 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.971 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:30 compute-1 nova_compute[189066]: 2025-12-05 09:42:30.972 189070 INFO os_vif [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:c2:8e,bridge_name='br-int',has_traffic_filtering=True,id=ce2e0b21-6b51-4972-a784-60843f6686f3,network=Network(c8127193-8f39-4d91-b81d-cba716b6b70a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2e0b21-6b')
Dec 05 09:42:31 compute-1 nova_compute[189066]: 2025-12-05 09:42:31.040 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:42:31 compute-1 nova_compute[189066]: 2025-12-05 09:42:31.041 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:42:31 compute-1 nova_compute[189066]: 2025-12-05 09:42:31.041 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No VIF found with MAC fa:16:3e:70:c2:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:42:31 compute-1 nova_compute[189066]: 2025-12-05 09:42:31.042 189070 INFO nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Using config drive
Dec 05 09:42:32 compute-1 nova_compute[189066]: 2025-12-05 09:42:32.319 189070 INFO nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Creating config drive at /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.config
Dec 05 09:42:32 compute-1 nova_compute[189066]: 2025-12-05 09:42:32.325 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp75i4pbpn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:42:32 compute-1 nova_compute[189066]: 2025-12-05 09:42:32.457 189070 DEBUG oslo_concurrency.processutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp75i4pbpn" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:42:32 compute-1 kernel: tapce2e0b21-6b: entered promiscuous mode
Dec 05 09:42:32 compute-1 NetworkManager[55704]: <info>  [1764927752.5281] manager: (tapce2e0b21-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Dec 05 09:42:32 compute-1 ovn_controller[95809]: 2025-12-05T09:42:32Z|00242|binding|INFO|Claiming lport ce2e0b21-6b51-4972-a784-60843f6686f3 for this chassis.
Dec 05 09:42:32 compute-1 ovn_controller[95809]: 2025-12-05T09:42:32Z|00243|binding|INFO|ce2e0b21-6b51-4972-a784-60843f6686f3: Claiming fa:16:3e:70:c2:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fe70:c28e 2001:db8::f816:3eff:fe70:c28e
Dec 05 09:42:32 compute-1 nova_compute[189066]: 2025-12-05 09:42:32.529 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:32 compute-1 nova_compute[189066]: 2025-12-05 09:42:32.533 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:32 compute-1 nova_compute[189066]: 2025-12-05 09:42:32.542 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.550 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:c2:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fe70:c28e 2001:db8::f816:3eff:fe70:c28e'], port_security=['fa:16:3e:70:c2:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fe70:c28e 2001:db8::f816:3eff:fe70:c28e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fe70:c28e/64 2001:db8::f816:3eff:fe70:c28e/64', 'neutron:device_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c8a624e-974c-4a4d-8814-5cb24cdfcef3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e534066c-24e2-4a55-800c-4e1e4900b516, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=ce2e0b21-6b51-4972-a784-60843f6686f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.552 105272 INFO neutron.agent.ovn.metadata.agent [-] Port ce2e0b21-6b51-4972-a784-60843f6686f3 in datapath c8127193-8f39-4d91-b81d-cba716b6b70a bound to our chassis
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.554 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8127193-8f39-4d91-b81d-cba716b6b70a
Dec 05 09:42:32 compute-1 systemd-udevd[231955]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.572 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[33b20038-6503-4807-9c2a-5b2b7f37ee45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.573 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8127193-81 in ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:42:32 compute-1 systemd-machined[154815]: New machine qemu-20-instance-0000002f.
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.579 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8127193-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.580 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c64e6feb-2d46-4299-81e9-9103ec6c90cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.581 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3db99a99-5ae8-445c-b0bd-b0d00a56f7f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 NetworkManager[55704]: <info>  [1764927752.5878] device (tapce2e0b21-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:42:32 compute-1 NetworkManager[55704]: <info>  [1764927752.5889] device (tapce2e0b21-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:42:32 compute-1 systemd[1]: Started Virtual Machine qemu-20-instance-0000002f.
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.605 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[773c8b86-c173-44d2-9b48-2477e949b5f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 nova_compute[189066]: 2025-12-05 09:42:32.616 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:32 compute-1 ovn_controller[95809]: 2025-12-05T09:42:32Z|00244|binding|INFO|Setting lport ce2e0b21-6b51-4972-a784-60843f6686f3 ovn-installed in OVS
Dec 05 09:42:32 compute-1 ovn_controller[95809]: 2025-12-05T09:42:32Z|00245|binding|INFO|Setting lport ce2e0b21-6b51-4972-a784-60843f6686f3 up in Southbound
Dec 05 09:42:32 compute-1 nova_compute[189066]: 2025-12-05 09:42:32.623 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.626 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[edf22b0f-c743-410f-bfd4-b35458d8cada]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.685 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[91ec2623-0495-4735-bb92-f1b1f410599d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.692 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d85de548-542d-4b3f-8e6f-56079619997d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 NetworkManager[55704]: <info>  [1764927752.6970] manager: (tapc8127193-80): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.738 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[46a57723-1cc6-46aa-9bf5-389ff4aaaed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.741 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[10448b1b-e770-453a-a6b7-fb169b48a4a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 NetworkManager[55704]: <info>  [1764927752.7698] device (tapc8127193-80): carrier: link connected
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.775 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[d64bd825-3fed-48ff-8ba1-370ed14de7ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.798 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[82b57961-3253-4c7d-b127-55b501a06b3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8127193-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:63:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502577, 'reachable_time': 18852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231989, 'error': None, 'target': 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.822 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc5755e-c7f9-4b56-a0b7-210ac2709868]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:6379'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 502577, 'tstamp': 502577}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231990, 'error': None, 'target': 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.843 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[30970b6d-4dd1-410d-b11d-aa510243219b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8127193-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:63:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502577, 'reachable_time': 18852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231991, 'error': None, 'target': 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.880 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf1f3de-ea1a-4de3-b33d-d6731feb5f35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.947 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[95d779d0-df3e-4e6b-81f9-f1ae480289d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.949 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8127193-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.950 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:42:32 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:32.950 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8127193-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:42:33 compute-1 NetworkManager[55704]: <info>  [1764927753.0048] manager: (tapc8127193-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Dec 05 09:42:33 compute-1 kernel: tapc8127193-80: entered promiscuous mode
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.004 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.007 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:33.008 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8127193-80, col_values=(('external_ids', {'iface-id': 'dbd588a7-89a1-4a96-82f6-7a2d0d05566b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.009 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.010 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:33 compute-1 ovn_controller[95809]: 2025-12-05T09:42:33Z|00246|binding|INFO|Releasing lport dbd588a7-89a1-4a96-82f6-7a2d0d05566b from this chassis (sb_readonly=0)
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:33.011 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8127193-8f39-4d91-b81d-cba716b6b70a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8127193-8f39-4d91-b81d-cba716b6b70a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:33.012 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbc8f67-93ba-4019-aa0f-e4bf99e7045e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:33.013 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-c8127193-8f39-4d91-b81d-cba716b6b70a
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/c8127193-8f39-4d91-b81d-cba716b6b70a.pid.haproxy
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID c8127193-8f39-4d91-b81d-cba716b6b70a
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:42:33 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:42:33.014 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'env', 'PROCESS_TAG=haproxy-c8127193-8f39-4d91-b81d-cba716b6b70a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8127193-8f39-4d91-b81d-cba716b6b70a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.024 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.086 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927753.0853584, b417b484-e28d-4f07-a7a6-9528c9b27a63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.087 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] VM Started (Lifecycle Event)
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.110 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.115 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927753.0854936, b417b484-e28d-4f07-a7a6-9528c9b27a63 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.116 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] VM Paused (Lifecycle Event)
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.143 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.148 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.204 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.343 189070 DEBUG nova.compute.manager [req-70d0aa17-040c-4187-98bf-ca519f6a3045 req-af91ebde-b5af-45af-9ecb-4a73ad154a54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.344 189070 DEBUG oslo_concurrency.lockutils [req-70d0aa17-040c-4187-98bf-ca519f6a3045 req-af91ebde-b5af-45af-9ecb-4a73ad154a54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.344 189070 DEBUG oslo_concurrency.lockutils [req-70d0aa17-040c-4187-98bf-ca519f6a3045 req-af91ebde-b5af-45af-9ecb-4a73ad154a54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.345 189070 DEBUG oslo_concurrency.lockutils [req-70d0aa17-040c-4187-98bf-ca519f6a3045 req-af91ebde-b5af-45af-9ecb-4a73ad154a54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.345 189070 DEBUG nova.compute.manager [req-70d0aa17-040c-4187-98bf-ca519f6a3045 req-af91ebde-b5af-45af-9ecb-4a73ad154a54 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Processing event network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.346 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.350 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927753.3502712, b417b484-e28d-4f07-a7a6-9528c9b27a63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.351 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] VM Resumed (Lifecycle Event)
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.353 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.357 189070 INFO nova.virt.libvirt.driver [-] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Instance spawned successfully.
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.358 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.392 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.396 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.397 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.397 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.398 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.398 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.398 189070 DEBUG nova.virt.libvirt.driver [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.406 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.446 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.479 189070 INFO nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Took 19.84 seconds to spawn the instance on the hypervisor.
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.480 189070 DEBUG nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:42:33 compute-1 podman[232026]: 2025-12-05 09:42:33.5200409 +0000 UTC m=+0.058903483 container create b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 09:42:33 compute-1 systemd[1]: Started libpod-conmon-b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0.scope.
Dec 05 09:42:33 compute-1 podman[232026]: 2025-12-05 09:42:33.489035651 +0000 UTC m=+0.027898284 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.584 189070 INFO nova.compute.manager [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Took 21.36 seconds to build instance.
Dec 05 09:42:33 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:42:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb1762d25d650fd77c763a7e533a3494ae0f880abb812cef8319a27fc36fb4d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:42:33 compute-1 podman[232026]: 2025-12-05 09:42:33.620083941 +0000 UTC m=+0.158946544 container init b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 05 09:42:33 compute-1 podman[232026]: 2025-12-05 09:42:33.625671457 +0000 UTC m=+0.164534040 container start b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:42:33 compute-1 nova_compute[189066]: 2025-12-05 09:42:33.649 189070 DEBUG oslo_concurrency.lockutils [None req-14d85949-fca8-4f4e-a1b8-beaeee3f2877 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:33 compute-1 neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a[232042]: [NOTICE]   (232046) : New worker (232048) forked
Dec 05 09:42:33 compute-1 neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a[232042]: [NOTICE]   (232046) : Loading success.
Dec 05 09:42:34 compute-1 podman[232057]: 2025-12-05 09:42:34.683609375 +0000 UTC m=+0.115239483 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:42:34 compute-1 nova_compute[189066]: 2025-12-05 09:42:34.714 189070 DEBUG nova.network.neutron [req-d9b499f9-fbf0-4d85-84f2-0153fd477e8c req-041b4b3a-f537-4c68-a139-2178bf062852 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updated VIF entry in instance network info cache for port ce2e0b21-6b51-4972-a784-60843f6686f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:42:34 compute-1 nova_compute[189066]: 2025-12-05 09:42:34.714 189070 DEBUG nova.network.neutron [req-d9b499f9-fbf0-4d85-84f2-0153fd477e8c req-041b4b3a-f537-4c68-a139-2178bf062852 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updating instance_info_cache with network_info: [{"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:42:34 compute-1 nova_compute[189066]: 2025-12-05 09:42:34.735 189070 DEBUG oslo_concurrency.lockutils [req-d9b499f9-fbf0-4d85-84f2-0153fd477e8c req-041b4b3a-f537-4c68-a139-2178bf062852 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:42:35 compute-1 nova_compute[189066]: 2025-12-05 09:42:35.489 189070 DEBUG nova.compute.manager [req-132d8951-8eca-4318-92b2-f39c028b1359 req-08c687b1-f07f-4410-8e5e-f262efed81c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:42:35 compute-1 nova_compute[189066]: 2025-12-05 09:42:35.490 189070 DEBUG oslo_concurrency.lockutils [req-132d8951-8eca-4318-92b2-f39c028b1359 req-08c687b1-f07f-4410-8e5e-f262efed81c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:35 compute-1 nova_compute[189066]: 2025-12-05 09:42:35.491 189070 DEBUG oslo_concurrency.lockutils [req-132d8951-8eca-4318-92b2-f39c028b1359 req-08c687b1-f07f-4410-8e5e-f262efed81c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:35 compute-1 nova_compute[189066]: 2025-12-05 09:42:35.491 189070 DEBUG oslo_concurrency.lockutils [req-132d8951-8eca-4318-92b2-f39c028b1359 req-08c687b1-f07f-4410-8e5e-f262efed81c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:35 compute-1 nova_compute[189066]: 2025-12-05 09:42:35.491 189070 DEBUG nova.compute.manager [req-132d8951-8eca-4318-92b2-f39c028b1359 req-08c687b1-f07f-4410-8e5e-f262efed81c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] No waiting events found dispatching network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:42:35 compute-1 nova_compute[189066]: 2025-12-05 09:42:35.491 189070 WARNING nova.compute.manager [req-132d8951-8eca-4318-92b2-f39c028b1359 req-08c687b1-f07f-4410-8e5e-f262efed81c6 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received unexpected event network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 for instance with vm_state active and task_state None.
Dec 05 09:42:35 compute-1 nova_compute[189066]: 2025-12-05 09:42:35.672 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:35 compute-1 nova_compute[189066]: 2025-12-05 09:42:35.963 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:37 compute-1 podman[232083]: 2025-12-05 09:42:37.656072809 +0000 UTC m=+0.094234048 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 09:42:37 compute-1 NetworkManager[55704]: <info>  [1764927757.7918] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Dec 05 09:42:37 compute-1 NetworkManager[55704]: <info>  [1764927757.7929] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Dec 05 09:42:37 compute-1 nova_compute[189066]: 2025-12-05 09:42:37.797 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:37 compute-1 nova_compute[189066]: 2025-12-05 09:42:37.850 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:37 compute-1 ovn_controller[95809]: 2025-12-05T09:42:37Z|00247|binding|INFO|Releasing lport dbd588a7-89a1-4a96-82f6-7a2d0d05566b from this chassis (sb_readonly=0)
Dec 05 09:42:37 compute-1 nova_compute[189066]: 2025-12-05 09:42:37.862 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:38 compute-1 nova_compute[189066]: 2025-12-05 09:42:38.864 189070 DEBUG nova.compute.manager [req-15962e7a-7db0-4ccb-b7f0-f94e17c456eb req-34a39e7f-0be8-42a3-95a4-65edfb4c277f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-changed-ce2e0b21-6b51-4972-a784-60843f6686f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:42:38 compute-1 nova_compute[189066]: 2025-12-05 09:42:38.865 189070 DEBUG nova.compute.manager [req-15962e7a-7db0-4ccb-b7f0-f94e17c456eb req-34a39e7f-0be8-42a3-95a4-65edfb4c277f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Refreshing instance network info cache due to event network-changed-ce2e0b21-6b51-4972-a784-60843f6686f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:42:38 compute-1 nova_compute[189066]: 2025-12-05 09:42:38.866 189070 DEBUG oslo_concurrency.lockutils [req-15962e7a-7db0-4ccb-b7f0-f94e17c456eb req-34a39e7f-0be8-42a3-95a4-65edfb4c277f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:42:38 compute-1 nova_compute[189066]: 2025-12-05 09:42:38.866 189070 DEBUG oslo_concurrency.lockutils [req-15962e7a-7db0-4ccb-b7f0-f94e17c456eb req-34a39e7f-0be8-42a3-95a4-65edfb4c277f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:42:38 compute-1 nova_compute[189066]: 2025-12-05 09:42:38.866 189070 DEBUG nova.network.neutron [req-15962e7a-7db0-4ccb-b7f0-f94e17c456eb req-34a39e7f-0be8-42a3-95a4-65edfb4c277f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Refreshing network info cache for port ce2e0b21-6b51-4972-a784-60843f6686f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:42:38 compute-1 ovn_controller[95809]: 2025-12-05T09:42:38Z|00248|binding|INFO|Releasing lport dbd588a7-89a1-4a96-82f6-7a2d0d05566b from this chassis (sb_readonly=0)
Dec 05 09:42:38 compute-1 nova_compute[189066]: 2025-12-05 09:42:38.921 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:39 compute-1 nova_compute[189066]: 2025-12-05 09:42:39.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:39 compute-1 podman[232104]: 2025-12-05 09:42:39.631703832 +0000 UTC m=+0.061633080 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 05 09:42:40 compute-1 nova_compute[189066]: 2025-12-05 09:42:40.674 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:40 compute-1 nova_compute[189066]: 2025-12-05 09:42:40.966 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:42 compute-1 podman[232126]: 2025-12-05 09:42:42.646092813 +0000 UTC m=+0.076757601 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 05 09:42:43 compute-1 nova_compute[189066]: 2025-12-05 09:42:43.091 189070 DEBUG nova.network.neutron [req-15962e7a-7db0-4ccb-b7f0-f94e17c456eb req-34a39e7f-0be8-42a3-95a4-65edfb4c277f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updated VIF entry in instance network info cache for port ce2e0b21-6b51-4972-a784-60843f6686f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:42:43 compute-1 nova_compute[189066]: 2025-12-05 09:42:43.092 189070 DEBUG nova.network.neutron [req-15962e7a-7db0-4ccb-b7f0-f94e17c456eb req-34a39e7f-0be8-42a3-95a4-65edfb4c277f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updating instance_info_cache with network_info: [{"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:42:43 compute-1 nova_compute[189066]: 2025-12-05 09:42:43.245 189070 DEBUG oslo_concurrency.lockutils [req-15962e7a-7db0-4ccb-b7f0-f94e17c456eb req-34a39e7f-0be8-42a3-95a4-65edfb4c277f 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:42:45 compute-1 nova_compute[189066]: 2025-12-05 09:42:45.443 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:45 compute-1 nova_compute[189066]: 2025-12-05 09:42:45.676 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:45 compute-1 nova_compute[189066]: 2025-12-05 09:42:45.968 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:46 compute-1 ovn_controller[95809]: 2025-12-05T09:42:46Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:70:c2:8e 10.100.0.9
Dec 05 09:42:46 compute-1 ovn_controller[95809]: 2025-12-05T09:42:46Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:70:c2:8e 10.100.0.9
Dec 05 09:42:47 compute-1 podman[232174]: 2025-12-05 09:42:47.641033757 +0000 UTC m=+0.073134353 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 05 09:42:50 compute-1 nova_compute[189066]: 2025-12-05 09:42:50.719 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:50 compute-1 nova_compute[189066]: 2025-12-05 09:42:50.970 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:51 compute-1 podman[232198]: 2025-12-05 09:42:51.641785234 +0000 UTC m=+0.068276694 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:42:52 compute-1 nova_compute[189066]: 2025-12-05 09:42:52.418 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:42:53 compute-1 nova_compute[189066]: 2025-12-05 09:42:53.113 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Triggering sync for uuid b417b484-e28d-4f07-a7a6-9528c9b27a63 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 05 09:42:53 compute-1 nova_compute[189066]: 2025-12-05 09:42:53.114 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:42:53 compute-1 nova_compute[189066]: 2025-12-05 09:42:53.114 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:42:53 compute-1 nova_compute[189066]: 2025-12-05 09:42:53.294 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:42:55 compute-1 nova_compute[189066]: 2025-12-05 09:42:55.683 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:55 compute-1 nova_compute[189066]: 2025-12-05 09:42:55.722 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:55 compute-1 nova_compute[189066]: 2025-12-05 09:42:55.973 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:42:57 compute-1 podman[232224]: 2025-12-05 09:42:57.636518913 +0000 UTC m=+0.072842695 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Dec 05 09:43:00 compute-1 nova_compute[189066]: 2025-12-05 09:43:00.724 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:00 compute-1 nova_compute[189066]: 2025-12-05 09:43:00.974 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:05 compute-1 podman[232245]: 2025-12-05 09:43:05.65164387 +0000 UTC m=+0.086868029 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 09:43:05 compute-1 nova_compute[189066]: 2025-12-05 09:43:05.725 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:05 compute-1 nova_compute[189066]: 2025-12-05 09:43:05.976 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:08 compute-1 podman[232271]: 2025-12-05 09:43:08.619142362 +0000 UTC m=+0.055761076 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 05 09:43:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:08.890 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:43:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:08.891 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:43:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:08.893 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:43:10 compute-1 podman[232292]: 2025-12-05 09:43:10.637906889 +0000 UTC m=+0.069656986 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 09:43:10 compute-1 nova_compute[189066]: 2025-12-05 09:43:10.728 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.758 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'name': 'tempest-TestGettingAddress-server-1007468349', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fa1cd463d74b49139a088d332d37e611', 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'hostId': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.760 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.760 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.760 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1007468349>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1007468349>]
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.761 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.764 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b417b484-e28d-4f07-a7a6-9528c9b27a63 / tapce2e0b21-6b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.765 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.outgoing.bytes volume: 4048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cae0b054-ef8d-4b5d-84a8-19e8cf240f80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4048, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.761234', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfbbb3d4-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': 'dd893657de6b84ec10c2c47062738cc74ab502eee7e41141dc9ad61be8dc8608'}]}, 'timestamp': '2025-12-05 09:43:10.767033', '_unique_id': 'd136950974ff45b0adc345135e9d688e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.773 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.777 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.810 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.write.latency volume: 2826592483 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.811 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8492944d-ed68-490d-89c0-ee979bf898af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2826592483, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.778117', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfc28894-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': 'a7ead873972acf2011bd4ea31fe00b48c163c15fe1890252fd76fc54fb8f6f7f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.778117', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfc29c58-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '342a1e69f10240d9dee24c62873a1ca052f80abc9a1061fa8bcd5b54fc192d58'}]}, 'timestamp': '2025-12-05 09:43:10.811649', '_unique_id': 'cd8e0526d9ac4a619c521c214aa718ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.814 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.814 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4d28da8-795f-449c-9c43-589e321ad71f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.814867', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfc32b78-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': 'c7a1f68c5581fc49c9c99890b47c60bdba04f06e0b726085f3d967382aebcbaf'}]}, 'timestamp': '2025-12-05 09:43:10.815281', '_unique_id': 'a7716028a46a404198cb558c46408568'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.816 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.817 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.817 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.write.requests volume: 307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.818 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08b09ae7-c0ba-4ee2-9e23-8ffa67278f03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 307, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.817946', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfc3a42c-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '45a2074cc0b0f4c616b1831715a3c8cb833ac07cc95c6ad54e1594b40e20aa5a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 
'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.817946', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfc3b336-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '80793b215d279d1a26e783e4ef0f0974ac46560b45b58a0898a20c118a7e265f'}]}, 'timestamp': '2025-12-05 09:43:10.818750', '_unique_id': '610046fb0b964067bd83f4ef785ba021'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.819 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.820 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.821 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.read.latency volume: 240313471 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.821 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.read.latency volume: 22917629 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15c60d7d-cdbd-4058-afd6-abc03d48c77a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 240313471, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.821055', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfc41b82-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '3c682ed12b4eb42bbfb200df7ce45f903d98dfd55a643231ebf1f74480bb5b7e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22917629, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 
'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.821055', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfc428ac-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '916c67dccfb2796b0002d5f5c5bd29b319e118e21e996b159b6e2c859de7cbbc'}]}, 'timestamp': '2025-12-05 09:43:10.821767', '_unique_id': '4b22c21b735e4b6babf73f8f2a065516'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.822 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.823 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.823 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e934ca5-28c7-42f9-8cdb-6bc912430466', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.823671', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfc48266-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': 'e861f3a7c34af9c8280e1ac17ea70a6c59df5f1644f9952cfe2f25f8b554dab8'}]}, 'timestamp': '2025-12-05 09:43:10.824047', '_unique_id': '287279fffe9b413891809ec806b3824a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.824 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.826 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.852 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/memory.usage volume: 46.52734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f2f7135-e7ec-47c6-a81d-e9fa9acb8af6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.52734375, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'timestamp': '2025-12-05T09:43:10.826458', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cfc9002a-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.909907834, 'message_signature': '33beb9f7dfc3453252625525f2cfbf517a1e314dd0ebe5c2465a60c7578ea643'}]}, 'timestamp': '2025-12-05 09:43:10.853743', '_unique_id': '551df3dad3a140e3bc61639ff2fcf7d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.855 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.857 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f4e7dfd-f56c-452f-836e-498d6b476cf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.857245', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfc9a8cc-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': 'a23cd3512b6a00e9085bad8bef93cbdc14060ae7d8ceee7779d93495a3f1d09d'}]}, 'timestamp': '2025-12-05 09:43:10.857885', '_unique_id': '0d1ed63b45ab4073a2beada553bbd8c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.858 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.860 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.860 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.860 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed1c7075-60bf-4d5b-b5bc-6032d59daa13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.860277', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfca1a28-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': 'e009c722816921916be865026c7fde39953dc69933e0534ef088c117a3d4ea05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 
'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.860277', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfca28ec-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '7e63171e7c74d07d433eb3145ce002123606cbbd634c20a3b6876868ad6be4d1'}]}, 'timestamp': '2025-12-05 09:43:10.861078', '_unique_id': 'a08e7d81d3d946cc8fbdee1d0f061a6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.863 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.880 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.881 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ff85eb8-e58e-40c0-9125-f6412a4a0c01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.863466', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfcd4932-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.921078587, 'message_signature': 'e3c875b6096328119d998c4561f4ca8ffb688dbd262419a208fddcb0e75fa808'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 
'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.863466', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfcd5c38-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.921078587, 'message_signature': '20051e3a4effaeee4f8c8ebdc297dca77706a59b10f07bd3e9453d808f0b7d8c'}]}, 'timestamp': '2025-12-05 09:43:10.882047', '_unique_id': 'bcb146b75dc14315b9b24b5f1c247149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.883 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.884 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.884 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.incoming.bytes volume: 4255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e4e130a-e867-440a-87b7-f5ed4c38ab7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4255, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.884718', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfcdd17c-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': '98b4326a4d9f77a049bcc2dc5aebfe4f369e60e7176d7991d0084ad56503c1c2'}]}, 'timestamp': '2025-12-05 09:43:10.885020', '_unique_id': '494296d00c614ff897d38a85492c1786'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.885 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.887 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.887 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.887 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1007468349>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1007468349>]
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.888 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/cpu volume: 12660000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3b4a88d-5078-4f72-9986-d7c821bb55ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12660000000, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'timestamp': '2025-12-05T09:43:10.888098', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cfce78ac-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.909907834, 'message_signature': 'e01670acb985b4f8b44056198f64cabd7604a0339aaf5fa29507d1ad24bcb41f'}]}, 'timestamp': '2025-12-05 09:43:10.889465', '_unique_id': '2864e517f5c043a7beee04ada9162755'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.890 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.891 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '804cce88-3aa3-4783-9eb2-79b806e96f82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.891841', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfcee72e-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': '3adf295228140035d2506a29dca6662992e253c334436110a490a47d6453b9e3'}]}, 'timestamp': '2025-12-05 09:43:10.892137', '_unique_id': 'abb40eee30cd4f29a8fe0f5b2c85eb4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.892 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.893 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.893 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1007468349>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1007468349>]
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.893 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.894 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f9bc4d8-8a20-4639-a327-0822e0037c0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.893870', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfcf3792-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.921078587, 'message_signature': '2e2bed518447fb2fa2e729ed96905b8b9f1acf8ca3d064f9f70bd55700739305'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.893870', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfcf444e-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.921078587, 'message_signature': '8a9e700a7cdabac855d1e406c9748e90d6bba57d99c78a0bb3acfac36945e967'}]}, 'timestamp': '2025-12-05 09:43:10.894569', '_unique_id': 'c2ece307e48e47cc874630c94136cef7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.896 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.896 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1007468349>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1007468349>]
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.896 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.896 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdc821bf-cd28-4356-b301-b3b91b908a3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.896451', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfcf9d54-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': '9cccab2611d194928cd79a8332db7f3528c137e9773edae52f00c8197260ab4e'}]}, 'timestamp': '2025-12-05 09:43:10.897433', '_unique_id': 'cbb623f2918e4b41ad7329411d8d7a24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.898 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27bc9ee6-a653-452f-b63b-1fce2667fcca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.898801', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfcff5a6-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': 'd0c19168d6c8c3aac31d7a172b1097b9af18cf4fff684604bcec7362d2a2d676'}]}, 'timestamp': '2025-12-05 09:43:10.899073', '_unique_id': '6dd0ed55f56e4a6ab1f453e8897a3279'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.899 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.900 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.900 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14cabf8e-8a4d-408c-82dc-15ca440cf9f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.900203', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfd02bfc-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.921078587, 'message_signature': '2d982ef23ad820aa94185660b2c22d6ffbd00e74bb98d4eb8e4463b7d02c436b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.900203', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfd034a8-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.921078587, 'message_signature': '6dde7b0d1e839b35cc0d926753caaee35be567fc152072e7469d77c471267e0a'}]}, 'timestamp': '2025-12-05 09:43:10.900645', '_unique_id': '42f82f8c94ed4865b1d3e91f5d41c390'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.901 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a56ade8a-8883-4b3f-8547-782e03ee5f18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.901835', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfd06c52-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': 'fb25acb9038bfe90e75802bf90bd8e5295dfc64e7c0ca65031b46c0c7d1bddfa'}]}, 'timestamp': '2025-12-05 09:43:10.902088', '_unique_id': 'fe7bd36e0fd8471cb2d35641d3339713'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.902 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.903 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.903 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.write.bytes volume: 72929280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.903 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6179ebf5-cb4a-4928-8bb2-c831226de95f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72929280, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.903365', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfd0aa46-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '22765938ab2e1fa6d9a2a42be7acd259b7215badae55b7fab0c451d4b27d6972'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.903365', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfd0b536-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '30d971104be4436aba084c239a50bac7bfee8cf88e3d573219b9d0439482718a'}]}, 'timestamp': '2025-12-05 09:43:10.903980', '_unique_id': 'aa0d6e339fa04188b79de85505b67535'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.904 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.905 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '205df9f4-6af6-44a7-8728-6fbb985fcfc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-0000002f-b417b484-e28d-4f07-a7a6-9528c9b27a63-tapce2e0b21-6b', 'timestamp': '2025-12-05T09:43:10.905433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'tapce2e0b21-6b', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:70:c2:8e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapce2e0b21-6b'}, 'message_id': 'cfd0f97e-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.818776981, 'message_signature': '58bd9f79fc2d661c3b384fc92ec5586a12859cf3483e003580b52def859c52a5'}]}, 'timestamp': '2025-12-05 09:43:10.905770', '_unique_id': 'd9701a55fc194114940b1efafd1d0d4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 DEBUG ceilometer.compute.pollsters [-] b417b484-e28d-4f07-a7a6-9528c9b27a63/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87bc2754-4036-4613-afd7-837448c3c92f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-vda', 'timestamp': '2025-12-05T09:43:10.906991', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cfd1357e-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': 'bdd8b6eabec826ce3e95ccc11940844e0f47f6f7ed794eecc2c655a7431bde57'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63-sda', 'timestamp': '2025-12-05T09:43:10.906991', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1007468349', 'name': 'instance-0000002f', 'instance_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cfd13df8-d1be-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5063.835805759, 'message_signature': '91f0330a298f7de61c44c2d331da2e05a63421db58b18413e9da7aae72f1e61a'}]}, 'timestamp': '2025-12-05 09:43:10.907436', '_unique_id': 'f2e62ed6ad1d436b9410fa596ee2ed4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:43:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:43:10.907 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:43:10 compute-1 nova_compute[189066]: 2025-12-05 09:43:10.979 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:13 compute-1 podman[232313]: 2025-12-05 09:43:13.628734643 +0000 UTC m=+0.062700075 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=openstack_network_exporter)
Dec 05 09:43:15 compute-1 nova_compute[189066]: 2025-12-05 09:43:15.730 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:15 compute-1 nova_compute[189066]: 2025-12-05 09:43:15.981 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:18 compute-1 podman[232335]: 2025-12-05 09:43:18.607798579 +0000 UTC m=+0.050628272 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:43:20 compute-1 nova_compute[189066]: 2025-12-05 09:43:20.778 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:20 compute-1 nova_compute[189066]: 2025-12-05 09:43:20.983 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:21 compute-1 nova_compute[189066]: 2025-12-05 09:43:21.710 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:21 compute-1 nova_compute[189066]: 2025-12-05 09:43:21.711 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:22 compute-1 podman[232359]: 2025-12-05 09:43:22.614129432 +0000 UTC m=+0.050358014 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:43:24 compute-1 nova_compute[189066]: 2025-12-05 09:43:24.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:25 compute-1 nova_compute[189066]: 2025-12-05 09:43:25.831 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:25 compute-1 nova_compute[189066]: 2025-12-05 09:43:25.986 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:26 compute-1 nova_compute[189066]: 2025-12-05 09:43:26.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:26 compute-1 nova_compute[189066]: 2025-12-05 09:43:26.020 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:43:27 compute-1 nova_compute[189066]: 2025-12-05 09:43:27.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:27 compute-1 nova_compute[189066]: 2025-12-05 09:43:27.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:43:27 compute-1 nova_compute[189066]: 2025-12-05 09:43:27.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:43:27 compute-1 nova_compute[189066]: 2025-12-05 09:43:27.244 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:43:27 compute-1 nova_compute[189066]: 2025-12-05 09:43:27.245 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:43:27 compute-1 nova_compute[189066]: 2025-12-05 09:43:27.245 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:43:27 compute-1 nova_compute[189066]: 2025-12-05 09:43:27.245 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b417b484-e28d-4f07-a7a6-9528c9b27a63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:43:28 compute-1 podman[232383]: 2025-12-05 09:43:28.630004668 +0000 UTC m=+0.057929600 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 05 09:43:30 compute-1 nova_compute[189066]: 2025-12-05 09:43:30.833 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:30 compute-1 nova_compute[189066]: 2025-12-05 09:43:30.988 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:35 compute-1 nova_compute[189066]: 2025-12-05 09:43:35.836 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:36 compute-1 nova_compute[189066]: 2025-12-05 09:43:36.020 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:36 compute-1 podman[232404]: 2025-12-05 09:43:36.675583918 +0000 UTC m=+0.109686467 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, managed_by=edpm_ansible)
Dec 05 09:43:36 compute-1 nova_compute[189066]: 2025-12-05 09:43:36.792 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:36.792 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:43:36 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:36.795 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.363 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updating instance_info_cache with network_info: [{"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.666 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.667 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.667 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.667 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.668 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.668 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.878 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.879 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.879 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:43:38 compute-1 nova_compute[189066]: 2025-12-05 09:43:38.879 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:43:38 compute-1 podman[232430]: 2025-12-05 09:43:38.993712028 +0000 UTC m=+0.056622797 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:43:39 compute-1 nova_compute[189066]: 2025-12-05 09:43:39.717 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:43:39 compute-1 nova_compute[189066]: 2025-12-05 09:43:39.791 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:43:39 compute-1 nova_compute[189066]: 2025-12-05 09:43:39.793 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:43:39 compute-1 nova_compute[189066]: 2025-12-05 09:43:39.859 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.045 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.047 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5570MB free_disk=73.29520797729492GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.048 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.048 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.202 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance b417b484-e28d-4f07-a7a6-9528c9b27a63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.203 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.203 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.252 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.419 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.823 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.824 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:43:40 compute-1 nova_compute[189066]: 2025-12-05 09:43:40.839 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:41 compute-1 nova_compute[189066]: 2025-12-05 09:43:41.023 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:41 compute-1 podman[232457]: 2025-12-05 09:43:41.628247096 +0000 UTC m=+0.070740863 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 05 09:43:44 compute-1 podman[232479]: 2025-12-05 09:43:44.633782391 +0000 UTC m=+0.071567583 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc.)
Dec 05 09:43:44 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:44.798 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:43:45 compute-1 nova_compute[189066]: 2025-12-05 09:43:45.842 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:46 compute-1 nova_compute[189066]: 2025-12-05 09:43:46.024 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:48 compute-1 ovn_controller[95809]: 2025-12-05T09:43:48Z|00249|binding|INFO|Releasing lport dbd588a7-89a1-4a96-82f6-7a2d0d05566b from this chassis (sb_readonly=0)
Dec 05 09:43:48 compute-1 nova_compute[189066]: 2025-12-05 09:43:48.681 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:49 compute-1 podman[232500]: 2025-12-05 09:43:49.611728057 +0000 UTC m=+0.051565824 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:43:50 compute-1 nova_compute[189066]: 2025-12-05 09:43:50.844 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.026 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.391 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.804 189070 DEBUG nova.compute.manager [req-d7684a44-c770-4c80-b82a-40e5f70c0825 req-a18db616-dd68-4f1c-a7ef-2d6d887d3405 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-changed-ce2e0b21-6b51-4972-a784-60843f6686f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.804 189070 DEBUG nova.compute.manager [req-d7684a44-c770-4c80-b82a-40e5f70c0825 req-a18db616-dd68-4f1c-a7ef-2d6d887d3405 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Refreshing instance network info cache due to event network-changed-ce2e0b21-6b51-4972-a784-60843f6686f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.805 189070 DEBUG oslo_concurrency.lockutils [req-d7684a44-c770-4c80-b82a-40e5f70c0825 req-a18db616-dd68-4f1c-a7ef-2d6d887d3405 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.805 189070 DEBUG oslo_concurrency.lockutils [req-d7684a44-c770-4c80-b82a-40e5f70c0825 req-a18db616-dd68-4f1c-a7ef-2d6d887d3405 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.805 189070 DEBUG nova.network.neutron [req-d7684a44-c770-4c80-b82a-40e5f70c0825 req-a18db616-dd68-4f1c-a7ef-2d6d887d3405 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Refreshing network info cache for port ce2e0b21-6b51-4972-a784-60843f6686f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.952 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.952 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.953 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.953 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.953 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.955 189070 INFO nova.compute.manager [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Terminating instance
Dec 05 09:43:51 compute-1 nova_compute[189066]: 2025-12-05 09:43:51.956 189070 DEBUG nova.compute.manager [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:43:51 compute-1 kernel: tapce2e0b21-6b (unregistering): left promiscuous mode
Dec 05 09:43:51 compute-1 NetworkManager[55704]: <info>  [1764927831.9937] device (tapce2e0b21-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:43:52 compute-1 ovn_controller[95809]: 2025-12-05T09:43:52Z|00250|binding|INFO|Releasing lport ce2e0b21-6b51-4972-a784-60843f6686f3 from this chassis (sb_readonly=0)
Dec 05 09:43:52 compute-1 ovn_controller[95809]: 2025-12-05T09:43:52Z|00251|binding|INFO|Setting lport ce2e0b21-6b51-4972-a784-60843f6686f3 down in Southbound
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.001 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:52 compute-1 ovn_controller[95809]: 2025-12-05T09:43:52Z|00252|binding|INFO|Removing iface tapce2e0b21-6b ovn-installed in OVS
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.006 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.014 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:c2:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fe70:c28e 2001:db8::f816:3eff:fe70:c28e'], port_security=['fa:16:3e:70:c2:8e 10.100.0.9 2001:db8:0:1:f816:3eff:fe70:c28e 2001:db8::f816:3eff:fe70:c28e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fe70:c28e/64 2001:db8::f816:3eff:fe70:c28e/64', 'neutron:device_id': 'b417b484-e28d-4f07-a7a6-9528c9b27a63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8127193-8f39-4d91-b81d-cba716b6b70a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c8a624e-974c-4a4d-8814-5cb24cdfcef3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e534066c-24e2-4a55-800c-4e1e4900b516, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=ce2e0b21-6b51-4972-a784-60843f6686f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.016 105272 INFO neutron.agent.ovn.metadata.agent [-] Port ce2e0b21-6b51-4972-a784-60843f6686f3 in datapath c8127193-8f39-4d91-b81d-cba716b6b70a unbound from our chassis
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.018 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8127193-8f39-4d91-b81d-cba716b6b70a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.020 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.021 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[82ac78d6-ac2b-41ff-bc32-f5d6bb4f8b40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.023 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a namespace which is not needed anymore
Dec 05 09:43:52 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Dec 05 09:43:52 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002f.scope: Consumed 16.937s CPU time.
Dec 05 09:43:52 compute-1 systemd-machined[154815]: Machine qemu-20-instance-0000002f terminated.
Dec 05 09:43:52 compute-1 neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a[232042]: [NOTICE]   (232046) : haproxy version is 2.8.14-c23fe91
Dec 05 09:43:52 compute-1 neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a[232042]: [NOTICE]   (232046) : path to executable is /usr/sbin/haproxy
Dec 05 09:43:52 compute-1 neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a[232042]: [WARNING]  (232046) : Exiting Master process...
Dec 05 09:43:52 compute-1 neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a[232042]: [ALERT]    (232046) : Current worker (232048) exited with code 143 (Terminated)
Dec 05 09:43:52 compute-1 neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a[232042]: [WARNING]  (232046) : All workers exited. Exiting... (0)
Dec 05 09:43:52 compute-1 systemd[1]: libpod-b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0.scope: Deactivated successfully.
Dec 05 09:43:52 compute-1 podman[232549]: 2025-12-05 09:43:52.195115082 +0000 UTC m=+0.053021340 container died b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:43:52 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0-userdata-shm.mount: Deactivated successfully.
Dec 05 09:43:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-bb1762d25d650fd77c763a7e533a3494ae0f880abb812cef8319a27fc36fb4d1-merged.mount: Deactivated successfully.
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.237 189070 INFO nova.virt.libvirt.driver [-] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Instance destroyed successfully.
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.238 189070 DEBUG nova.objects.instance [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'resources' on Instance uuid b417b484-e28d-4f07-a7a6-9528c9b27a63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:43:52 compute-1 podman[232549]: 2025-12-05 09:43:52.243263731 +0000 UTC m=+0.101169999 container cleanup b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:43:52 compute-1 systemd[1]: libpod-conmon-b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0.scope: Deactivated successfully.
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.270 189070 DEBUG nova.virt.libvirt.vif [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:42:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1007468349',display_name='tempest-TestGettingAddress-server-1007468349',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1007468349',id=47,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF7b/7MwbHNgolWlFgvs9a+gQGW2Zj7PA97WQ9maV2HyX6ljndlAHVDtzh0a6QTU+HpOhqDK5bEN+TBhnr5fx5xapjRV4/ouFtwlIZGp978P/7BBZIVuAmc+cOHOO30M/A==',key_name='tempest-TestGettingAddress-1631840748',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:42:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-p615iv1e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:42:33Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=b417b484-e28d-4f07-a7a6-9528c9b27a63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", 
"type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.271 189070 DEBUG nova.network.os_vif_util [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.272 189070 DEBUG nova.network.os_vif_util [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:70:c2:8e,bridge_name='br-int',has_traffic_filtering=True,id=ce2e0b21-6b51-4972-a784-60843f6686f3,network=Network(c8127193-8f39-4d91-b81d-cba716b6b70a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2e0b21-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.273 189070 DEBUG os_vif [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:c2:8e,bridge_name='br-int',has_traffic_filtering=True,id=ce2e0b21-6b51-4972-a784-60843f6686f3,network=Network(c8127193-8f39-4d91-b81d-cba716b6b70a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2e0b21-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.276 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.276 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce2e0b21-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.278 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.280 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.284 189070 INFO os_vif [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:c2:8e,bridge_name='br-int',has_traffic_filtering=True,id=ce2e0b21-6b51-4972-a784-60843f6686f3,network=Network(c8127193-8f39-4d91-b81d-cba716b6b70a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2e0b21-6b')
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.285 189070 INFO nova.virt.libvirt.driver [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Deleting instance files /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63_del
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.286 189070 INFO nova.virt.libvirt.driver [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Deletion of /var/lib/nova/instances/b417b484-e28d-4f07-a7a6-9528c9b27a63_del complete
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.294 189070 DEBUG nova.compute.manager [req-d1fdac76-d91e-403c-86c2-4f2f12df616b req-68095f0f-0c80-41c5-8b6f-f4a6a3d61a78 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-vif-unplugged-ce2e0b21-6b51-4972-a784-60843f6686f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.295 189070 DEBUG oslo_concurrency.lockutils [req-d1fdac76-d91e-403c-86c2-4f2f12df616b req-68095f0f-0c80-41c5-8b6f-f4a6a3d61a78 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.295 189070 DEBUG oslo_concurrency.lockutils [req-d1fdac76-d91e-403c-86c2-4f2f12df616b req-68095f0f-0c80-41c5-8b6f-f4a6a3d61a78 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.295 189070 DEBUG oslo_concurrency.lockutils [req-d1fdac76-d91e-403c-86c2-4f2f12df616b req-68095f0f-0c80-41c5-8b6f-f4a6a3d61a78 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.296 189070 DEBUG nova.compute.manager [req-d1fdac76-d91e-403c-86c2-4f2f12df616b req-68095f0f-0c80-41c5-8b6f-f4a6a3d61a78 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] No waiting events found dispatching network-vif-unplugged-ce2e0b21-6b51-4972-a784-60843f6686f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.296 189070 DEBUG nova.compute.manager [req-d1fdac76-d91e-403c-86c2-4f2f12df616b req-68095f0f-0c80-41c5-8b6f-f4a6a3d61a78 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-vif-unplugged-ce2e0b21-6b51-4972-a784-60843f6686f3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:43:52 compute-1 podman[232594]: 2025-12-05 09:43:52.312500407 +0000 UTC m=+0.044468811 container remove b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.318 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3a0b95-82c8-4ae1-bc25-cd357174a67c]: (4, ('Fri Dec  5 09:43:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a (b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0)\nb6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0\nFri Dec  5 09:43:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a (b6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0)\nb6492b9ac72ef1cb25a25a4a873f3cb5f92a1abcc6b177567fc49146fa779df0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.320 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[33172199-60dc-4315-a072-6704c09423fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.322 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8127193-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.324 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:52 compute-1 kernel: tapc8127193-80: left promiscuous mode
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.335 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.343 189070 INFO nova.compute.manager [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Took 0.39 seconds to destroy the instance on the hypervisor.
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.344 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[635316f2-4a78-4ce0-885f-1546de55b167]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.346 189070 DEBUG oslo.service.loopingcall [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.346 189070 DEBUG nova.compute.manager [-] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:43:52 compute-1 nova_compute[189066]: 2025-12-05 09:43:52.347 189070 DEBUG nova.network.neutron [-] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.359 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[49d523e2-eb7f-4b26-95e1-a6ab5d8d09c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.360 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[957923b2-b8d5-4aa5-a0eb-bd180e575cf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.378 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0987ba-f592-4889-a514-a177f505afee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502568, 'reachable_time': 36260, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232610, 'error': None, 'target': 'ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:43:52 compute-1 systemd[1]: run-netns-ovnmeta\x2dc8127193\x2d8f39\x2d4d91\x2db81d\x2dcba716b6b70a.mount: Deactivated successfully.
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.386 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8127193-8f39-4d91-b81d-cba716b6b70a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:43:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:43:52.387 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[ec07dd21-2823-4beb-886c-8f88b79c9575]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:43:53 compute-1 nova_compute[189066]: 2025-12-05 09:43:53.486 189070 DEBUG nova.network.neutron [-] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:43:53 compute-1 nova_compute[189066]: 2025-12-05 09:43:53.516 189070 INFO nova.compute.manager [-] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Took 1.17 seconds to deallocate network for instance.
Dec 05 09:43:53 compute-1 nova_compute[189066]: 2025-12-05 09:43:53.603 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:43:53 compute-1 nova_compute[189066]: 2025-12-05 09:43:53.604 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:43:53 compute-1 podman[232611]: 2025-12-05 09:43:53.611477859 +0000 UTC m=+0.056812023 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:43:53 compute-1 nova_compute[189066]: 2025-12-05 09:43:53.672 189070 DEBUG nova.compute.provider_tree [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:43:53 compute-1 nova_compute[189066]: 2025-12-05 09:43:53.698 189070 DEBUG nova.scheduler.client.report [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:43:53 compute-1 nova_compute[189066]: 2025-12-05 09:43:53.830 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:43:53 compute-1 nova_compute[189066]: 2025-12-05 09:43:53.945 189070 INFO nova.scheduler.client.report [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Deleted allocations for instance b417b484-e28d-4f07-a7a6-9528c9b27a63
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.151 189070 DEBUG oslo_concurrency.lockutils [None req-50b6877d-10f2-4097-8daf-38fffe148a81 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.273 189070 DEBUG nova.network.neutron [req-d7684a44-c770-4c80-b82a-40e5f70c0825 req-a18db616-dd68-4f1c-a7ef-2d6d887d3405 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updated VIF entry in instance network info cache for port ce2e0b21-6b51-4972-a784-60843f6686f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.273 189070 DEBUG nova.network.neutron [req-d7684a44-c770-4c80-b82a-40e5f70c0825 req-a18db616-dd68-4f1c-a7ef-2d6d887d3405 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Updating instance_info_cache with network_info: [{"id": "ce2e0b21-6b51-4972-a784-60843f6686f3", "address": "fa:16:3e:70:c2:8e", "network": {"id": "c8127193-8f39-4d91-b81d-cba716b6b70a", "bridge": "br-int", "label": "tempest-network-smoke--1577898204", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe70:c28e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2e0b21-6b", "ovs_interfaceid": "ce2e0b21-6b51-4972-a784-60843f6686f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.601 189070 DEBUG nova.compute.manager [req-0165b4e6-81c5-400c-a423-53f1704da3f8 req-307bf3f8-dd67-4573-8228-c30840a42064 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.601 189070 DEBUG oslo_concurrency.lockutils [req-0165b4e6-81c5-400c-a423-53f1704da3f8 req-307bf3f8-dd67-4573-8228-c30840a42064 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.601 189070 DEBUG oslo_concurrency.lockutils [req-0165b4e6-81c5-400c-a423-53f1704da3f8 req-307bf3f8-dd67-4573-8228-c30840a42064 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.602 189070 DEBUG oslo_concurrency.lockutils [req-0165b4e6-81c5-400c-a423-53f1704da3f8 req-307bf3f8-dd67-4573-8228-c30840a42064 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b417b484-e28d-4f07-a7a6-9528c9b27a63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.602 189070 DEBUG nova.compute.manager [req-0165b4e6-81c5-400c-a423-53f1704da3f8 req-307bf3f8-dd67-4573-8228-c30840a42064 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] No waiting events found dispatching network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.602 189070 WARNING nova.compute.manager [req-0165b4e6-81c5-400c-a423-53f1704da3f8 req-307bf3f8-dd67-4573-8228-c30840a42064 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received unexpected event network-vif-plugged-ce2e0b21-6b51-4972-a784-60843f6686f3 for instance with vm_state deleted and task_state None.
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.602 189070 DEBUG nova.compute.manager [req-0165b4e6-81c5-400c-a423-53f1704da3f8 req-307bf3f8-dd67-4573-8228-c30840a42064 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Received event network-vif-deleted-ce2e0b21-6b51-4972-a784-60843f6686f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:43:54 compute-1 nova_compute[189066]: 2025-12-05 09:43:54.604 189070 DEBUG oslo_concurrency.lockutils [req-d7684a44-c770-4c80-b82a-40e5f70c0825 req-a18db616-dd68-4f1c-a7ef-2d6d887d3405 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b417b484-e28d-4f07-a7a6-9528c9b27a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:43:55 compute-1 nova_compute[189066]: 2025-12-05 09:43:55.861 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:57 compute-1 nova_compute[189066]: 2025-12-05 09:43:57.278 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:43:59 compute-1 podman[232636]: 2025-12-05 09:43:59.636017096 +0000 UTC m=+0.077768275 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:44:00 compute-1 nova_compute[189066]: 2025-12-05 09:44:00.448 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:00 compute-1 nova_compute[189066]: 2025-12-05 09:44:00.630 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:00 compute-1 nova_compute[189066]: 2025-12-05 09:44:00.865 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:02 compute-1 nova_compute[189066]: 2025-12-05 09:44:02.282 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:05 compute-1 nova_compute[189066]: 2025-12-05 09:44:05.867 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:07 compute-1 nova_compute[189066]: 2025-12-05 09:44:07.232 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927832.231234, b417b484-e28d-4f07-a7a6-9528c9b27a63 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:44:07 compute-1 nova_compute[189066]: 2025-12-05 09:44:07.234 189070 INFO nova.compute.manager [-] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] VM Stopped (Lifecycle Event)
Dec 05 09:44:07 compute-1 nova_compute[189066]: 2025-12-05 09:44:07.284 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:07 compute-1 podman[232659]: 2025-12-05 09:44:07.680444019 +0000 UTC m=+0.110799394 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:44:07 compute-1 nova_compute[189066]: 2025-12-05 09:44:07.705 189070 DEBUG nova.compute.manager [None req-178f8327-a36c-4abc-93d0-95782144a64f - - - - - -] [instance: b417b484-e28d-4f07-a7a6-9528c9b27a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:44:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:08.891 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:08.891 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:08.892 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:09 compute-1 podman[232685]: 2025-12-05 09:44:09.620338875 +0000 UTC m=+0.061488467 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:44:10 compute-1 nova_compute[189066]: 2025-12-05 09:44:10.915 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:12 compute-1 nova_compute[189066]: 2025-12-05 09:44:12.288 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:12 compute-1 podman[232705]: 2025-12-05 09:44:12.629222211 +0000 UTC m=+0.068549060 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:44:15 compute-1 podman[232725]: 2025-12-05 09:44:15.619987933 +0000 UTC m=+0.057924289 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public)
Dec 05 09:44:15 compute-1 nova_compute[189066]: 2025-12-05 09:44:15.918 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:17 compute-1 nova_compute[189066]: 2025-12-05 09:44:17.291 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:20 compute-1 podman[232747]: 2025-12-05 09:44:20.623561009 +0000 UTC m=+0.059097138 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:44:20 compute-1 nova_compute[189066]: 2025-12-05 09:44:20.976 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:22 compute-1 nova_compute[189066]: 2025-12-05 09:44:22.294 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:24 compute-1 podman[232771]: 2025-12-05 09:44:24.621337732 +0000 UTC m=+0.055967372 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:44:25 compute-1 nova_compute[189066]: 2025-12-05 09:44:25.178 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:25 compute-1 nova_compute[189066]: 2025-12-05 09:44:25.178 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:25 compute-1 nova_compute[189066]: 2025-12-05 09:44:25.206 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:25 compute-1 nova_compute[189066]: 2025-12-05 09:44:25.207 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:26 compute-1 nova_compute[189066]: 2025-12-05 09:44:26.016 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:27 compute-1 nova_compute[189066]: 2025-12-05 09:44:27.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:27 compute-1 nova_compute[189066]: 2025-12-05 09:44:27.023 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:44:27 compute-1 nova_compute[189066]: 2025-12-05 09:44:27.024 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:44:27 compute-1 nova_compute[189066]: 2025-12-05 09:44:27.042 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:44:27 compute-1 nova_compute[189066]: 2025-12-05 09:44:27.298 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:27.454 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:d5:04 10.100.0.2 2001:db8::f816:3eff:fea0:d504'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fea0:d504/64', 'neutron:device_id': 'ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae013098-2701-4ae8-8621-a0515ff9b432', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5135dc1-9df2-4212-8110-1f16b401ed19, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8c05e074-0cad-4652-a193-a5450971b273) old=Port_Binding(mac=['fa:16:3e:a0:d5:04 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae013098-2701-4ae8-8621-a0515ff9b432', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:44:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:27.456 105272 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8c05e074-0cad-4652-a193-a5450971b273 in datapath ae013098-2701-4ae8-8621-a0515ff9b432 updated
Dec 05 09:44:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:27.458 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae013098-2701-4ae8-8621-a0515ff9b432, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:44:27 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:27.460 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[74b89943-a386-40f1-a291-dd250b6b2d58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:28 compute-1 nova_compute[189066]: 2025-12-05 09:44:28.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:28 compute-1 nova_compute[189066]: 2025-12-05 09:44:28.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.060 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.061 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.061 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.062 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.268 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.270 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5729MB free_disk=73.3238639831543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.270 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.270 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.367 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.368 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.394 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.410 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.443 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:44:29 compute-1 nova_compute[189066]: 2025-12-05 09:44:29.443 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:30 compute-1 podman[232796]: 2025-12-05 09:44:30.632987364 +0000 UTC m=+0.072282881 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 05 09:44:31 compute-1 nova_compute[189066]: 2025-12-05 09:44:31.037 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:32 compute-1 nova_compute[189066]: 2025-12-05 09:44:32.332 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:32 compute-1 nova_compute[189066]: 2025-12-05 09:44:32.443 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:44:36 compute-1 nova_compute[189066]: 2025-12-05 09:44:36.041 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:37 compute-1 nova_compute[189066]: 2025-12-05 09:44:37.334 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:38 compute-1 podman[232817]: 2025-12-05 09:44:38.6987208 +0000 UTC m=+0.102220094 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:44:38 compute-1 nova_compute[189066]: 2025-12-05 09:44:38.716 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "b185217d-5f23-40c8-9f11-7cf87a1886c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:38 compute-1 nova_compute[189066]: 2025-12-05 09:44:38.716 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:38 compute-1 nova_compute[189066]: 2025-12-05 09:44:38.740 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:44:38 compute-1 nova_compute[189066]: 2025-12-05 09:44:38.879 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:38 compute-1 nova_compute[189066]: 2025-12-05 09:44:38.879 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:38 compute-1 nova_compute[189066]: 2025-12-05 09:44:38.889 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:44:38 compute-1 nova_compute[189066]: 2025-12-05 09:44:38.890 189070 INFO nova.compute.claims [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.063 189070 DEBUG nova.compute.provider_tree [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.090 189070 DEBUG nova.scheduler.client.report [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.125 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.126 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.505 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.506 189070 DEBUG nova.network.neutron [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.539 189070 INFO nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.562 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.696 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.698 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.699 189070 INFO nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Creating image(s)
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.700 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "/var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.700 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "/var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.701 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "/var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.717 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.747 189070 DEBUG nova.policy [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.782 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.783 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.783 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.796 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.860 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.861 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.900 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.902 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.903 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:44:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:39.927 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:44:39 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:39.928 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.929 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.975 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.977 189070 DEBUG nova.virt.disk.api [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Checking if we can resize image /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:44:39 compute-1 nova_compute[189066]: 2025-12-05 09:44:39.977 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:44:40 compute-1 nova_compute[189066]: 2025-12-05 09:44:40.044 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:44:40 compute-1 nova_compute[189066]: 2025-12-05 09:44:40.046 189070 DEBUG nova.virt.disk.api [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Cannot resize image /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:44:40 compute-1 nova_compute[189066]: 2025-12-05 09:44:40.046 189070 DEBUG nova.objects.instance [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'migration_context' on Instance uuid b185217d-5f23-40c8-9f11-7cf87a1886c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:44:40 compute-1 nova_compute[189066]: 2025-12-05 09:44:40.074 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:44:40 compute-1 nova_compute[189066]: 2025-12-05 09:44:40.074 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Ensure instance console log exists: /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:44:40 compute-1 nova_compute[189066]: 2025-12-05 09:44:40.075 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:40 compute-1 nova_compute[189066]: 2025-12-05 09:44:40.075 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:40 compute-1 nova_compute[189066]: 2025-12-05 09:44:40.076 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:40 compute-1 podman[232859]: 2025-12-05 09:44:40.625487346 +0000 UTC m=+0.063875195 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 05 09:44:41 compute-1 nova_compute[189066]: 2025-12-05 09:44:41.043 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:41 compute-1 nova_compute[189066]: 2025-12-05 09:44:41.437 189070 DEBUG nova.network.neutron [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Successfully created port: 5db43502-374a-4ff3-8bcc-a41bb1ae8440 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:44:42 compute-1 nova_compute[189066]: 2025-12-05 09:44:42.360 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:42.931 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:44:43 compute-1 podman[232881]: 2025-12-05 09:44:43.653324166 +0000 UTC m=+0.081920557 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:44:43 compute-1 nova_compute[189066]: 2025-12-05 09:44:43.815 189070 DEBUG nova.network.neutron [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Successfully updated port: 5db43502-374a-4ff3-8bcc-a41bb1ae8440 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:44:43 compute-1 nova_compute[189066]: 2025-12-05 09:44:43.832 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:44:43 compute-1 nova_compute[189066]: 2025-12-05 09:44:43.833 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquired lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:44:43 compute-1 nova_compute[189066]: 2025-12-05 09:44:43.833 189070 DEBUG nova.network.neutron [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:44:43 compute-1 nova_compute[189066]: 2025-12-05 09:44:43.946 189070 DEBUG nova.compute.manager [req-a8743a21-7ced-4766-bd7a-fb17a298273f req-289bcffd-0d39-4500-9d9a-b4486d878fd1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-changed-5db43502-374a-4ff3-8bcc-a41bb1ae8440 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:44:43 compute-1 nova_compute[189066]: 2025-12-05 09:44:43.946 189070 DEBUG nova.compute.manager [req-a8743a21-7ced-4766-bd7a-fb17a298273f req-289bcffd-0d39-4500-9d9a-b4486d878fd1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Refreshing instance network info cache due to event network-changed-5db43502-374a-4ff3-8bcc-a41bb1ae8440. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:44:43 compute-1 nova_compute[189066]: 2025-12-05 09:44:43.947 189070 DEBUG oslo_concurrency.lockutils [req-a8743a21-7ced-4766-bd7a-fb17a298273f req-289bcffd-0d39-4500-9d9a-b4486d878fd1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:44:44 compute-1 nova_compute[189066]: 2025-12-05 09:44:44.056 189070 DEBUG nova.network.neutron [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:44:44 compute-1 sshd-session[232878]: Received disconnect from 101.47.162.91 port 57098:11: Bye Bye [preauth]
Dec 05 09:44:44 compute-1 sshd-session[232878]: Disconnected from authenticating user root 101.47.162.91 port 57098 [preauth]
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.090 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.445 189070 DEBUG nova.network.neutron [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updating instance_info_cache with network_info: [{"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.480 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Releasing lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.481 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Instance network_info: |[{"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.482 189070 DEBUG oslo_concurrency.lockutils [req-a8743a21-7ced-4766-bd7a-fb17a298273f req-289bcffd-0d39-4500-9d9a-b4486d878fd1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.482 189070 DEBUG nova.network.neutron [req-a8743a21-7ced-4766-bd7a-fb17a298273f req-289bcffd-0d39-4500-9d9a-b4486d878fd1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Refreshing network info cache for port 5db43502-374a-4ff3-8bcc-a41bb1ae8440 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.486 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Start _get_guest_xml network_info=[{"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.492 189070 WARNING nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.500 189070 DEBUG nova.virt.libvirt.host [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.501 189070 DEBUG nova.virt.libvirt.host [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.510 189070 DEBUG nova.virt.libvirt.host [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.511 189070 DEBUG nova.virt.libvirt.host [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.512 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.513 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.513 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.513 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.514 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.514 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.514 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.514 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.514 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.515 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.515 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.515 189070 DEBUG nova.virt.hardware [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.519 189070 DEBUG nova.virt.libvirt.vif [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-252150820',display_name='tempest-TestGettingAddress-server-252150820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-252150820',id=50,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHoHBnhl39mFqNwjOALF3l4BETvPQ2aaZWcNux2LXAx5w8R9/0SIqzLs8h9GojCCKVzXry2j15SmWDirbWSgGInNAL1B7ZEoog4LHyeWtsxbXu6L3ScevWo6qmZ3/Z1CQ==',key_name='tempest-TestGettingAddress-998480094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-ok2ytoz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:44:39Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=b185217d-5f23-40c8-9f11-7cf87a1886c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.520 189070 DEBUG nova.network.os_vif_util [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.521 189070 DEBUG nova.network.os_vif_util [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:d0:61,bridge_name='br-int',has_traffic_filtering=True,id=5db43502-374a-4ff3-8bcc-a41bb1ae8440,network=Network(ae013098-2701-4ae8-8621-a0515ff9b432),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db43502-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.522 189070 DEBUG nova.objects.instance [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'pci_devices' on Instance uuid b185217d-5f23-40c8-9f11-7cf87a1886c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.548 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <uuid>b185217d-5f23-40c8-9f11-7cf87a1886c3</uuid>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <name>instance-00000032</name>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <nova:name>tempest-TestGettingAddress-server-252150820</nova:name>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:44:46</nova:creationTime>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:44:46 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:44:46 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:44:46 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:44:46 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:44:46 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:44:46 compute-1 nova_compute[189066]:         <nova:user uuid="fae1c60e378945ea84b34c4824b835b1">tempest-TestGettingAddress-8368731-project-member</nova:user>
Dec 05 09:44:46 compute-1 nova_compute[189066]:         <nova:project uuid="fa1cd463d74b49139a088d332d37e611">tempest-TestGettingAddress-8368731</nova:project>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:44:46 compute-1 nova_compute[189066]:         <nova:port uuid="5db43502-374a-4ff3-8bcc-a41bb1ae8440">
Dec 05 09:44:46 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe47:d061" ipVersion="6"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <system>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <entry name="serial">b185217d-5f23-40c8-9f11-7cf87a1886c3</entry>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <entry name="uuid">b185217d-5f23-40c8-9f11-7cf87a1886c3</entry>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </system>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <os>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   </os>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <features>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   </features>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.config"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:47:d0:61"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <target dev="tap5db43502-37"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/console.log" append="off"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <video>
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </video>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:44:46 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:44:46 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:44:46 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:44:46 compute-1 nova_compute[189066]: </domain>
Dec 05 09:44:46 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.550 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Preparing to wait for external event network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.551 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.551 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.551 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.553 189070 DEBUG nova.virt.libvirt.vif [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-252150820',display_name='tempest-TestGettingAddress-server-252150820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-252150820',id=50,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHoHBnhl39mFqNwjOALF3l4BETvPQ2aaZWcNux2LXAx5w8R9/0SIqzLs8h9GojCCKVzXry2j15SmWDirbWSgGInNAL1B7ZEoog4LHyeWtsxbXu6L3ScevWo6qmZ3/Z1CQ==',key_name='tempest-TestGettingAddress-998480094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-ok2ytoz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:44:39Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=b185217d-5f23-40c8-9f11-7cf87a1886c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.553 189070 DEBUG nova.network.os_vif_util [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.555 189070 DEBUG nova.network.os_vif_util [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:d0:61,bridge_name='br-int',has_traffic_filtering=True,id=5db43502-374a-4ff3-8bcc-a41bb1ae8440,network=Network(ae013098-2701-4ae8-8621-a0515ff9b432),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db43502-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.556 189070 DEBUG os_vif [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:d0:61,bridge_name='br-int',has_traffic_filtering=True,id=5db43502-374a-4ff3-8bcc-a41bb1ae8440,network=Network(ae013098-2701-4ae8-8621-a0515ff9b432),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db43502-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.557 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.557 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.558 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.567 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.568 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5db43502-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.568 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5db43502-37, col_values=(('external_ids', {'iface-id': '5db43502-374a-4ff3-8bcc-a41bb1ae8440', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:d0:61', 'vm-uuid': 'b185217d-5f23-40c8-9f11-7cf87a1886c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.570 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.573 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:44:46 compute-1 NetworkManager[55704]: <info>  [1764927886.5756] manager: (tap5db43502-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.578 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.579 189070 INFO os_vif [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:d0:61,bridge_name='br-int',has_traffic_filtering=True,id=5db43502-374a-4ff3-8bcc-a41bb1ae8440,network=Network(ae013098-2701-4ae8-8621-a0515ff9b432),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db43502-37')
Dec 05 09:44:46 compute-1 podman[232901]: 2025-12-05 09:44:46.629995353 +0000 UTC m=+0.071279627 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.646 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.647 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.647 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] No VIF found with MAC fa:16:3e:47:d0:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:44:46 compute-1 nova_compute[189066]: 2025-12-05 09:44:46.648 189070 INFO nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Using config drive
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.048 189070 INFO nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Creating config drive at /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.config
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.053 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpio3cmmgb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.181 189070 DEBUG oslo_concurrency.processutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpio3cmmgb" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:44:47 compute-1 kernel: tap5db43502-37: entered promiscuous mode
Dec 05 09:44:47 compute-1 NetworkManager[55704]: <info>  [1764927887.2483] manager: (tap5db43502-37): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.275 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:47 compute-1 ovn_controller[95809]: 2025-12-05T09:44:47Z|00253|binding|INFO|Claiming lport 5db43502-374a-4ff3-8bcc-a41bb1ae8440 for this chassis.
Dec 05 09:44:47 compute-1 ovn_controller[95809]: 2025-12-05T09:44:47Z|00254|binding|INFO|5db43502-374a-4ff3-8bcc-a41bb1ae8440: Claiming fa:16:3e:47:d0:61 10.100.0.4 2001:db8::f816:3eff:fe47:d061
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.282 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.297 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:d0:61 10.100.0.4 2001:db8::f816:3eff:fe47:d061'], port_security=['fa:16:3e:47:d0:61 10.100.0.4 2001:db8::f816:3eff:fe47:d061'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe47:d061/64', 'neutron:device_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae013098-2701-4ae8-8621-a0515ff9b432', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dd3745ba-bc1d-4ff0-a3b8-e555336632a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5135dc1-9df2-4212-8110-1f16b401ed19, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=5db43502-374a-4ff3-8bcc-a41bb1ae8440) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.298 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 5db43502-374a-4ff3-8bcc-a41bb1ae8440 in datapath ae013098-2701-4ae8-8621-a0515ff9b432 bound to our chassis
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.300 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ae013098-2701-4ae8-8621-a0515ff9b432
Dec 05 09:44:47 compute-1 systemd-udevd[232937]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:44:47 compute-1 systemd-machined[154815]: New machine qemu-21-instance-00000032.
Dec 05 09:44:47 compute-1 NetworkManager[55704]: <info>  [1764927887.3236] device (tap5db43502-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:44:47 compute-1 NetworkManager[55704]: <info>  [1764927887.3248] device (tap5db43502-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.323 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[0ecf56b1-5fd9-4c80-a0a9-b0f3c7d3b0ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.325 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapae013098-21 in ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.329 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapae013098-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.329 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a8dba0cc-17e2-4380-bd8f-089308aefdf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.331 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d32edb-b678-4e9e-beb3-61e938db5540]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_controller[95809]: 2025-12-05T09:44:47Z|00255|binding|INFO|Setting lport 5db43502-374a-4ff3-8bcc-a41bb1ae8440 ovn-installed in OVS
Dec 05 09:44:47 compute-1 ovn_controller[95809]: 2025-12-05T09:44:47Z|00256|binding|INFO|Setting lport 5db43502-374a-4ff3-8bcc-a41bb1ae8440 up in Southbound
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.339 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:47 compute-1 systemd[1]: Started Virtual Machine qemu-21-instance-00000032.
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.348 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[133d6539-fa2f-4b79-bdcc-4db11fa37d3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.368 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[8c897557-9e73-45d6-b384-152d515b34d7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.411 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[08e42898-62e6-4b08-8361-3bcc74e92938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 NetworkManager[55704]: <info>  [1764927887.4211] manager: (tapae013098-20): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.420 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[047f55c3-6565-4146-854c-38e80f9d52f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.461 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[8423dc86-4a25-4bc3-97a9-0eac5efc9932]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.466 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[1f51ddba-cc3b-439d-b3fe-ec53d6f93d35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 NetworkManager[55704]: <info>  [1764927887.5032] device (tapae013098-20): carrier: link connected
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.511 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[1b71f9f1-ad06-4655-ac1e-fdeb8a5c72b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.535 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[d1344646-7094-43bd-8c03-bc6e79ab3472]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae013098-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:d5:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516050, 'reachable_time': 26227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232971, 'error': None, 'target': 'ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.556 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[7318472a-6d4d-4392-b835-07b78a8871d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:d504'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516050, 'tstamp': 516050}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232972, 'error': None, 'target': 'ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.580 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b300d7-e571-4757-8338-e945d3b07466]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae013098-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:d5:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516050, 'reachable_time': 26227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232973, 'error': None, 'target': 'ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.624 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3346392a-d4bf-4d26-87f3-ca25efa61f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.704 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[77a41bf3-c6c2-4df8-8579-96fc6ab19368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.706 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae013098-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.706 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.707 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae013098-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:44:47 compute-1 kernel: tapae013098-20: entered promiscuous mode
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.708 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:47 compute-1 NetworkManager[55704]: <info>  [1764927887.7096] manager: (tapae013098-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.711 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.712 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapae013098-20, col_values=(('external_ids', {'iface-id': '8c05e074-0cad-4652-a193-a5450971b273'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.713 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:47 compute-1 ovn_controller[95809]: 2025-12-05T09:44:47Z|00257|binding|INFO|Releasing lport 8c05e074-0cad-4652-a193-a5450971b273 from this chassis (sb_readonly=0)
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.714 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.716 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ae013098-2701-4ae8-8621-a0515ff9b432.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ae013098-2701-4ae8-8621-a0515ff9b432.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.717 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2c436669-dcda-46a8-9bb9-838a231c3ea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.718 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-ae013098-2701-4ae8-8621-a0515ff9b432
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/ae013098-2701-4ae8-8621-a0515ff9b432.pid.haproxy
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID ae013098-2701-4ae8-8621-a0515ff9b432
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:44:47 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:44:47.721 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432', 'env', 'PROCESS_TAG=haproxy-ae013098-2701-4ae8-8621-a0515ff9b432', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ae013098-2701-4ae8-8621-a0515ff9b432.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:44:47 compute-1 nova_compute[189066]: 2025-12-05 09:44:47.726 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:48 compute-1 podman[233010]: 2025-12-05 09:44:48.204870082 +0000 UTC m=+0.064750547 container create de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 09:44:48 compute-1 systemd[1]: Started libpod-conmon-de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8.scope.
Dec 05 09:44:48 compute-1 nova_compute[189066]: 2025-12-05 09:44:48.243 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927888.2427824, b185217d-5f23-40c8-9f11-7cf87a1886c3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:44:48 compute-1 nova_compute[189066]: 2025-12-05 09:44:48.244 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] VM Started (Lifecycle Event)
Dec 05 09:44:48 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:44:48 compute-1 podman[233010]: 2025-12-05 09:44:48.172617862 +0000 UTC m=+0.032498347 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:44:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3ea5bfa49793bf54265759ba8bdca0abcad146031c0a36cfdee477dff3e66f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:44:48 compute-1 nova_compute[189066]: 2025-12-05 09:44:48.275 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:44:48 compute-1 nova_compute[189066]: 2025-12-05 09:44:48.284 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927888.244259, b185217d-5f23-40c8-9f11-7cf87a1886c3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:44:48 compute-1 nova_compute[189066]: 2025-12-05 09:44:48.285 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] VM Paused (Lifecycle Event)
Dec 05 09:44:48 compute-1 podman[233010]: 2025-12-05 09:44:48.293234886 +0000 UTC m=+0.153115381 container init de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 05 09:44:48 compute-1 podman[233010]: 2025-12-05 09:44:48.299378946 +0000 UTC m=+0.159259411 container start de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 09:44:48 compute-1 nova_compute[189066]: 2025-12-05 09:44:48.309 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:44:48 compute-1 nova_compute[189066]: 2025-12-05 09:44:48.313 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:44:48 compute-1 neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432[233027]: [NOTICE]   (233031) : New worker (233033) forked
Dec 05 09:44:48 compute-1 neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432[233027]: [NOTICE]   (233031) : Loading success.
Dec 05 09:44:48 compute-1 nova_compute[189066]: 2025-12-05 09:44:48.351 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.624 189070 DEBUG nova.compute.manager [req-f9fc5d5f-ee72-4b00-adec-59836a4b04eb req-3f6e6daf-5141-4b52-af3f-3555f8d39691 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.626 189070 DEBUG oslo_concurrency.lockutils [req-f9fc5d5f-ee72-4b00-adec-59836a4b04eb req-3f6e6daf-5141-4b52-af3f-3555f8d39691 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.627 189070 DEBUG oslo_concurrency.lockutils [req-f9fc5d5f-ee72-4b00-adec-59836a4b04eb req-3f6e6daf-5141-4b52-af3f-3555f8d39691 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.627 189070 DEBUG oslo_concurrency.lockutils [req-f9fc5d5f-ee72-4b00-adec-59836a4b04eb req-3f6e6daf-5141-4b52-af3f-3555f8d39691 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.628 189070 DEBUG nova.compute.manager [req-f9fc5d5f-ee72-4b00-adec-59836a4b04eb req-3f6e6daf-5141-4b52-af3f-3555f8d39691 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Processing event network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.629 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.633 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764927890.6331377, b185217d-5f23-40c8-9f11-7cf87a1886c3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.633 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] VM Resumed (Lifecycle Event)
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.636 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.639 189070 INFO nova.virt.libvirt.driver [-] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Instance spawned successfully.
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.639 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.660 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.667 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.671 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.671 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.672 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.672 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.673 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.673 189070 DEBUG nova.virt.libvirt.driver [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.704 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.767 189070 INFO nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Took 11.07 seconds to spawn the instance on the hypervisor.
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.768 189070 DEBUG nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.854 189070 INFO nova.compute.manager [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Took 12.02 seconds to build instance.
Dec 05 09:44:50 compute-1 nova_compute[189066]: 2025-12-05 09:44:50.882 189070 DEBUG oslo_concurrency.lockutils [None req-d179c2a5-18c7-43fe-8d46-cda14b25e832 fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:51 compute-1 nova_compute[189066]: 2025-12-05 09:44:51.138 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:51 compute-1 nova_compute[189066]: 2025-12-05 09:44:51.197 189070 DEBUG nova.network.neutron [req-a8743a21-7ced-4766-bd7a-fb17a298273f req-289bcffd-0d39-4500-9d9a-b4486d878fd1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updated VIF entry in instance network info cache for port 5db43502-374a-4ff3-8bcc-a41bb1ae8440. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:44:51 compute-1 nova_compute[189066]: 2025-12-05 09:44:51.198 189070 DEBUG nova.network.neutron [req-a8743a21-7ced-4766-bd7a-fb17a298273f req-289bcffd-0d39-4500-9d9a-b4486d878fd1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updating instance_info_cache with network_info: [{"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:44:51 compute-1 nova_compute[189066]: 2025-12-05 09:44:51.235 189070 DEBUG oslo_concurrency.lockutils [req-a8743a21-7ced-4766-bd7a-fb17a298273f req-289bcffd-0d39-4500-9d9a-b4486d878fd1 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:44:51 compute-1 nova_compute[189066]: 2025-12-05 09:44:51.571 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:51 compute-1 podman[233042]: 2025-12-05 09:44:51.624387384 +0000 UTC m=+0.057611032 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:44:53 compute-1 nova_compute[189066]: 2025-12-05 09:44:53.437 189070 DEBUG nova.compute.manager [req-6847ccab-f637-4b8d-a1a1-a4ac2c9ab8c7 req-9edde50b-5af8-4722-917d-5e9887558457 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:44:53 compute-1 nova_compute[189066]: 2025-12-05 09:44:53.438 189070 DEBUG oslo_concurrency.lockutils [req-6847ccab-f637-4b8d-a1a1-a4ac2c9ab8c7 req-9edde50b-5af8-4722-917d-5e9887558457 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:44:53 compute-1 nova_compute[189066]: 2025-12-05 09:44:53.438 189070 DEBUG oslo_concurrency.lockutils [req-6847ccab-f637-4b8d-a1a1-a4ac2c9ab8c7 req-9edde50b-5af8-4722-917d-5e9887558457 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:44:53 compute-1 nova_compute[189066]: 2025-12-05 09:44:53.438 189070 DEBUG oslo_concurrency.lockutils [req-6847ccab-f637-4b8d-a1a1-a4ac2c9ab8c7 req-9edde50b-5af8-4722-917d-5e9887558457 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:44:53 compute-1 nova_compute[189066]: 2025-12-05 09:44:53.439 189070 DEBUG nova.compute.manager [req-6847ccab-f637-4b8d-a1a1-a4ac2c9ab8c7 req-9edde50b-5af8-4722-917d-5e9887558457 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] No waiting events found dispatching network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:44:53 compute-1 nova_compute[189066]: 2025-12-05 09:44:53.439 189070 WARNING nova.compute.manager [req-6847ccab-f637-4b8d-a1a1-a4ac2c9ab8c7 req-9edde50b-5af8-4722-917d-5e9887558457 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received unexpected event network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 for instance with vm_state active and task_state None.
Dec 05 09:44:55 compute-1 podman[233069]: 2025-12-05 09:44:55.63165805 +0000 UTC m=+0.066978972 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:44:56 compute-1 nova_compute[189066]: 2025-12-05 09:44:56.182 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:44:56 compute-1 nova_compute[189066]: 2025-12-05 09:44:56.573 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:01 compute-1 nova_compute[189066]: 2025-12-05 09:45:01.188 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:01 compute-1 nova_compute[189066]: 2025-12-05 09:45:01.576 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:01 compute-1 podman[233093]: 2025-12-05 09:45:01.643906966 +0000 UTC m=+0.078325349 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:45:03 compute-1 sshd-session[233068]: error: kex_exchange_identification: read: Connection timed out
Dec 05 09:45:03 compute-1 sshd-session[233068]: banner exchange: Connection from 221.237.163.202 port 34922: Connection timed out
Dec 05 09:45:04 compute-1 ovn_controller[95809]: 2025-12-05T09:45:04Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:d0:61 10.100.0.4
Dec 05 09:45:04 compute-1 ovn_controller[95809]: 2025-12-05T09:45:04Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:d0:61 10.100.0.4
Dec 05 09:45:05 compute-1 nova_compute[189066]: 2025-12-05 09:45:05.271 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:05 compute-1 NetworkManager[55704]: <info>  [1764927905.2722] manager: (patch-br-int-to-provnet-540ef657-1e81-4b72-856b-0e1c84504735): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Dec 05 09:45:05 compute-1 NetworkManager[55704]: <info>  [1764927905.2734] manager: (patch-provnet-540ef657-1e81-4b72-856b-0e1c84504735-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Dec 05 09:45:05 compute-1 nova_compute[189066]: 2025-12-05 09:45:05.408 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:05 compute-1 ovn_controller[95809]: 2025-12-05T09:45:05Z|00258|binding|INFO|Releasing lport 8c05e074-0cad-4652-a193-a5450971b273 from this chassis (sb_readonly=0)
Dec 05 09:45:05 compute-1 nova_compute[189066]: 2025-12-05 09:45:05.430 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:06 compute-1 nova_compute[189066]: 2025-12-05 09:45:06.228 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:06 compute-1 nova_compute[189066]: 2025-12-05 09:45:06.579 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:08 compute-1 nova_compute[189066]: 2025-12-05 09:45:08.635 189070 DEBUG nova.compute.manager [req-c13e69d8-bdb9-46c5-9dbe-9595c6a82911 req-8fa8594b-cbb8-4ea3-b4b5-98396b0f8d80 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-changed-5db43502-374a-4ff3-8bcc-a41bb1ae8440 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:45:08 compute-1 nova_compute[189066]: 2025-12-05 09:45:08.636 189070 DEBUG nova.compute.manager [req-c13e69d8-bdb9-46c5-9dbe-9595c6a82911 req-8fa8594b-cbb8-4ea3-b4b5-98396b0f8d80 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Refreshing instance network info cache due to event network-changed-5db43502-374a-4ff3-8bcc-a41bb1ae8440. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:45:08 compute-1 nova_compute[189066]: 2025-12-05 09:45:08.636 189070 DEBUG oslo_concurrency.lockutils [req-c13e69d8-bdb9-46c5-9dbe-9595c6a82911 req-8fa8594b-cbb8-4ea3-b4b5-98396b0f8d80 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:45:08 compute-1 nova_compute[189066]: 2025-12-05 09:45:08.637 189070 DEBUG oslo_concurrency.lockutils [req-c13e69d8-bdb9-46c5-9dbe-9595c6a82911 req-8fa8594b-cbb8-4ea3-b4b5-98396b0f8d80 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:45:08 compute-1 nova_compute[189066]: 2025-12-05 09:45:08.637 189070 DEBUG nova.network.neutron [req-c13e69d8-bdb9-46c5-9dbe-9595c6a82911 req-8fa8594b-cbb8-4ea3-b4b5-98396b0f8d80 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Refreshing network info cache for port 5db43502-374a-4ff3-8bcc-a41bb1ae8440 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:45:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:08.893 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:45:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:08.895 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:45:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:08.896 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:45:09 compute-1 podman[233125]: 2025-12-05 09:45:09.69034908 +0000 UTC m=+0.128632571 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.763 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'name': 'tempest-TestGettingAddress-server-252150820', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000032', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fa1cd463d74b49139a088d332d37e611', 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'hostId': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.804 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.read.latency volume: 212000734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.806 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.read.latency volume: 26553177 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9f1e4dc-d726-4b08-8cb1-ce38eb14f05f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 212000734, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.766439', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '17485432-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': '6616bac159079a4f936ce18fa4dc2d7833be45d03ae79f0b48f565fe0b29e339'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26553177, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.766439', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1748801a-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': '089b4b9cb9994b09bd8158202cbf2db6f3e1d74cb48b519db2aa72098943ce82'}]}, 'timestamp': '2025-12-05 09:45:10.807458', '_unique_id': '7f2b45fe98524f61b5e76325e68fe461'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.813 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.814 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.819 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b185217d-5f23-40c8-9f11-7cf87a1886c3 / tap5db43502-37 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.819 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24cb33bb-2197-4df0-87a0-d67b8db211fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.815085', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '174a7b72-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': '1e60855d6267f80f727add73f066e23afb10f76286cc8c9dfd2312c1ad35ff16'}]}, 'timestamp': '2025-12-05 09:45:10.820326', '_unique_id': 'd5bdb03c8d8f4c638eff099916b74488'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.821 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.822 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.839 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ab42b12-1318-46a4-90f0-068aa60f13b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'timestamp': '2025-12-05T09:45:10.822367', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '174d8682-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.896798998, 'message_signature': '9421c2a8944b8f0163c2582433bbd050443a6d23f601d51da843d1177721d562'}]}, 'timestamp': '2025-12-05 09:45:10.840356', '_unique_id': '933bcdf578824ad3960f9dbc494869f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.841 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.842 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.843 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.843 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-252150820>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-252150820>]
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.843 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.843 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7084724b-d709-49ba-b63a-a5c4c8a2adc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.843596', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '174e16ec-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': '6d03c48f241979ac89821d4f77a063c2068d6fd55ba35e97e3df1aa34815ac24'}]}, 'timestamp': '2025-12-05 09:45:10.843905', '_unique_id': 'ed6f1c0b5918428bbb76718df584c926'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.844 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.845 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f71a7eb9-de85-4116-8789-cb004eecdfc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.845288', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '174e5774-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': '2881a83b582e81bd217adb038053966811e4ea6a902d09dbb8ee8aab65c566d3'}]}, 'timestamp': '2025-12-05 09:45:10.845565', '_unique_id': '7aca5d7b002c4024a9b33a50f59d60d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.write.latency volume: 2393608375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.846 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91430118-e429-4302-805d-9e24b20017a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2393608375, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.846681', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '174e8df2-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': '07160e057569b9b9528f0256c479107de413a8b5ea3b5226cca89222160f16ba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 
'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.846681', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '174e9658-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': '84e549914b283a7d41213e674389f79d2140129505368775f13e0cad6129f1e3'}]}, 'timestamp': '2025-12-05 09:45:10.847132', '_unique_id': 'fbd8f3e938434bce8fb15d6982638091'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.847 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.848 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.858 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.859 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4003fafd-298c-4b1b-8146-619ebb5ab8a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.848299', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '17507a4a-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.905851849, 'message_signature': '0488630cacbea8b30d42764d71d71c81695c92420b38692f20ab6d8f66f1a65e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 
'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.848299', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '17508b34-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.905851849, 'message_signature': '34764866b13047be60e31c860099165eec8218ff9b92dd513986d8996c868824'}]}, 'timestamp': '2025-12-05 09:45:10.859978', '_unique_id': '6ccaf6cca3434269905eb616981ad06e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.861 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.862 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9df1f04-a9f2-4f77-9621-15b23b4bb9a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.862243', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '1750ef5c-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': '59b2d2a5c9d5a2df0972a02d319fa867b932e4d85dd8bce04ae3213c15d7dda8'}]}, 'timestamp': '2025-12-05 09:45:10.862608', '_unique_id': '7bab94fa6e5f4c4ab32757fbee11a3ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.863 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.864 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.incoming.bytes volume: 1866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '424b628f-bffe-4c9d-83e6-695c7fafa573', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1866, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.864460', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '17514704-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': '7b45d91268a73604db9fb2077c4dd172b5ef3c33a45da3dc295dab003751d92d'}]}, 'timestamp': '2025-12-05 09:45:10.864833', '_unique_id': 'f3e028a318d4481ebe5abeb3a6ceaa4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.865 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.866 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9989c747-6d48-41b7-9c3e-1e0f253b4b82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.866609', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '17519902-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': 'ff7dcc682e12c081c57674d9b221b84fa71bfd216dc4bf2b62b23cae7d4721d2'}]}, 'timestamp': '2025-12-05 09:45:10.866908', '_unique_id': '636b368184bc4894a765c4cc9a6072f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.867 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.868 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.868 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.outgoing.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '176b0585-edef-4292-99af-6e6d8b6f6daa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.868548', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '1751e4b6-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': 'a67636fa05f9ffc7e3033824950395efea6b7f1c90ab43170cab283f97d1ddd7'}]}, 'timestamp': '2025-12-05 09:45:10.868877', '_unique_id': 'ff9c537512cb441bbb7111b961a89704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.869 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.870 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.870 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-252150820>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-252150820>]
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.870 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.871 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1541d2c-25fe-4349-b106-8a1a5b60a7c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.870832', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '17523f1a-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': 'f31cf1fc35ffbfc5cd2e20032468df5f2efd1fd7f0a6692960da4b36092bb849'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.870832', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '175249c4-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': 'a794c3dc9f2dbf43056120b5706b4c3f0d6243638b63cf387ef00facaa288004'}]}, 'timestamp': '2025-12-05 09:45:10.871391', '_unique_id': '5b4c47e55131461e84d52422bc42ba7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.872 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/cpu volume: 12990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fa4dd16-2d3b-4349-aef5-f3c8717b4c50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12990000000, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'timestamp': '2025-12-05T09:45:10.872919', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '17528fba-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.896798998, 'message_signature': 'd9edc15a981f6d7b2049c8428f883cae6fcb1117eaa7f8a178b42b94220bc955'}]}, 'timestamp': '2025-12-05 09:45:10.873195', '_unique_id': '5eae595137bc46259df1967c4c595809'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.873 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.874 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.874 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.874 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06701e67-b5b3-4797-ad3c-fb711d034ea6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.874402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1752ca16-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.905851849, 'message_signature': '72882abf78dfad6d7b64af0b71d4caf93dabc3056c244525c64885dc0c0f7be7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.874402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1752d2a4-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.905851849, 'message_signature': '5460c55642d66d47b2b2be6c099d4bdd6ad03da5d01bc54e4e13fb2e73f1e34e'}]}, 'timestamp': '2025-12-05 09:45:10.874909', '_unique_id': '669e024c9a314f8498b8169cb4915001'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.875 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.876 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.876 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-252150820>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-252150820>]
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.876 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.876 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33277cee-1f54-470f-ab8f-4648102f6e5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.876401', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '17531700-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.905851849, 'message_signature': '6141ae0bd3fc9b07a85497f7be27a405a2a3ce9b1c8974c844beab7bce476485'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 
'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.876401', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '17531f52-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.905851849, 'message_signature': '0783f16ba538514dc0114f5773fd88308d75525a4b12f8e92e216ffc56ee6137'}]}, 'timestamp': '2025-12-05 09:45:10.876851', '_unique_id': '1fb3171e7cda42239b68e5d94e4f4e1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.878 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.878 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da486d3b-8817-4f0a-a1af-965afef0719f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.878067', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '17535850-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': '869778efe82dadd1687fb24f898ca41f06d9c601f3e24432cf4f5804a864fa16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 
'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.878067', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '17536156-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': 'dc74b4e7272459377f80de303092301fa913a5c0e792d6ea378b761a965f34f8'}]}, 'timestamp': '2025-12-05 09:45:10.878566', '_unique_id': '6c7ce19b9b854b559d0c2adb7952bfbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.879 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.outgoing.bytes volume: 2572 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6ccff6f-b854-4dd8-89d8-94bbc40a46ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2572, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.879824', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '17539d6a-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': '9235906808ca096359dd2070b0fb9ade9f1f200353f311c72819bf35530bcfae'}]}, 'timestamp': '2025-12-05 09:45:10.880144', '_unique_id': 'aece08bb70cf4856892a3f90f41cf4e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.880 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.881 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbc2b6c3-8ead-4059-8bba-fcf15d6ab986', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.881362', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '1753d80c-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': 'ab4bfeec602fde7c50eaebb0a3d605eb4695d2d919d5529eb1b4eb2b5889be2f'}]}, 'timestamp': '2025-12-05 09:45:10.881627', '_unique_id': 'd35a996cda1249b68fd83a9ab3679215'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.882 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.883 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15d5603f-9c08-4253-bfde-25dac354e36f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.882895', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '175414e8-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': 'cb3fc4eafd9ea5dfb427f4b2b3458595e68ddcad3991b84514d2dbdd9febcaec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.882895', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '17541ee8-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': '0061315e94e14bcb95de93a6cd30f1db06e3ad449bd023e0c97603a77eb72c77'}]}, 'timestamp': '2025-12-05 09:45:10.883401', '_unique_id': '4d2e5750f641407c8b0cc9664a7d76df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.884 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.read.bytes volume: 30206464 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d275058-5878-47fc-8d41-40445431c831', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30206464, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-vda', 'timestamp': '2025-12-05T09:45:10.884856', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '175460e2-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': '5f2a1c5778ee1703eda43625b085efcd71aaf334f59389afd0949c50e14e9c24'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3-sda', 'timestamp': '2025-12-05T09:45:10.884856', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'instance-00000032', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '17546a9c-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.824132009, 'message_signature': 'bab48af7901823ab4adfd189087c83c8e54b77aa55d4ff4d277fc3b043857420'}]}, 'timestamp': '2025-12-05 09:45:10.885335', '_unique_id': '1c5fbe9a48004c94b10b63e2c8f81f9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.885 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.886 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.886 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-252150820>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-252150820>]
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 DEBUG ceilometer.compute.pollsters [-] b185217d-5f23-40c8-9f11-7cf87a1886c3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f028bc1d-351b-4b63-a67a-a43363c351f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fae1c60e378945ea84b34c4824b835b1', 'user_name': None, 'project_id': 'fa1cd463d74b49139a088d332d37e611', 'project_name': None, 'resource_id': 'instance-00000032-b185217d-5f23-40c8-9f11-7cf87a1886c3-tap5db43502-37', 'timestamp': '2025-12-05T09:45:10.886998', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-252150820', 'name': 'tap5db43502-37', 'instance_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'instance_type': 'm1.nano', 'host': '2bcb83f9ba5db29ea79c746e62a8a2dc63c5c999436a0571013bf81c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'image_ref': '3ebffd97-b242-42d7-b245-ebdaf8e4377c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:d0:61', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5db43502-37'}, 'message_id': '1754b470-d1bf-11f0-a2f3-fa163e9454b0', 'monotonic_time': 5183.872686508, 'message_signature': '3d0da1027758c62438db184f4cdbc5ceea32449b4c2e814fdb75bb5b23727d45'}]}, 'timestamp': '2025-12-05 09:45:10.887239', '_unique_id': '41e4c36362744b7a965b97d277e642b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     yield
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 05 09:45:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:45:10.887 12 ERROR oslo_messaging.notify.messaging 
Dec 05 09:45:11 compute-1 nova_compute[189066]: 2025-12-05 09:45:11.287 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:11 compute-1 nova_compute[189066]: 2025-12-05 09:45:11.582 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:11 compute-1 podman[233152]: 2025-12-05 09:45:11.629182651 +0000 UTC m=+0.061529118 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 05 09:45:12 compute-1 nova_compute[189066]: 2025-12-05 09:45:12.432 189070 DEBUG nova.network.neutron [req-c13e69d8-bdb9-46c5-9dbe-9595c6a82911 req-8fa8594b-cbb8-4ea3-b4b5-98396b0f8d80 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updated VIF entry in instance network info cache for port 5db43502-374a-4ff3-8bcc-a41bb1ae8440. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:45:12 compute-1 nova_compute[189066]: 2025-12-05 09:45:12.433 189070 DEBUG nova.network.neutron [req-c13e69d8-bdb9-46c5-9dbe-9595c6a82911 req-8fa8594b-cbb8-4ea3-b4b5-98396b0f8d80 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updating instance_info_cache with network_info: [{"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:45:12 compute-1 nova_compute[189066]: 2025-12-05 09:45:12.460 189070 DEBUG oslo_concurrency.lockutils [req-c13e69d8-bdb9-46c5-9dbe-9595c6a82911 req-8fa8594b-cbb8-4ea3-b4b5-98396b0f8d80 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:45:14 compute-1 podman[233172]: 2025-12-05 09:45:14.628280067 +0000 UTC m=+0.063429154 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd)
Dec 05 09:45:16 compute-1 nova_compute[189066]: 2025-12-05 09:45:16.340 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:16 compute-1 nova_compute[189066]: 2025-12-05 09:45:16.584 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:17 compute-1 podman[233192]: 2025-12-05 09:45:17.644747069 +0000 UTC m=+0.082812039 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 05 09:45:21 compute-1 nova_compute[189066]: 2025-12-05 09:45:21.381 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:21 compute-1 nova_compute[189066]: 2025-12-05 09:45:21.585 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:22 compute-1 podman[233215]: 2025-12-05 09:45:22.644032011 +0000 UTC m=+0.084060029 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:45:24 compute-1 nova_compute[189066]: 2025-12-05 09:45:24.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:24 compute-1 nova_compute[189066]: 2025-12-05 09:45:24.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:24 compute-1 nova_compute[189066]: 2025-12-05 09:45:24.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:26 compute-1 nova_compute[189066]: 2025-12-05 09:45:26.383 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:26 compute-1 nova_compute[189066]: 2025-12-05 09:45:26.588 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:26 compute-1 podman[233240]: 2025-12-05 09:45:26.625627919 +0000 UTC m=+0.065509705 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:45:27 compute-1 nova_compute[189066]: 2025-12-05 09:45:27.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:27 compute-1 nova_compute[189066]: 2025-12-05 09:45:27.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:45:27 compute-1 nova_compute[189066]: 2025-12-05 09:45:27.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:45:27 compute-1 nova_compute[189066]: 2025-12-05 09:45:27.495 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:45:27 compute-1 nova_compute[189066]: 2025-12-05 09:45:27.496 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquired lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:45:27 compute-1 nova_compute[189066]: 2025-12-05 09:45:27.496 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 05 09:45:27 compute-1 nova_compute[189066]: 2025-12-05 09:45:27.496 189070 DEBUG nova.objects.instance [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b185217d-5f23-40c8-9f11-7cf87a1886c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:45:31 compute-1 nova_compute[189066]: 2025-12-05 09:45:31.537 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:31 compute-1 nova_compute[189066]: 2025-12-05 09:45:31.591 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:32 compute-1 podman[233264]: 2025-12-05 09:45:32.632421811 +0000 UTC m=+0.077529750 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.249 189070 DEBUG nova.network.neutron [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updating instance_info_cache with network_info: [{"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.675 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Releasing lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.675 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.676 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.677 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.678 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.678 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.679 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.680 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.772 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.773 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.774 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.774 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.921 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.995 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:45:33 compute-1 nova_compute[189066]: 2025-12-05 09:45:33.997 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.057 189070 DEBUG oslo_concurrency.processutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.210 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.212 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5587MB free_disk=73.29489517211914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.212 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.213 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.440 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Instance b185217d-5f23-40c8-9f11-7cf87a1886c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.441 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.441 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.579 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.959 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.989 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:45:34 compute-1 nova_compute[189066]: 2025-12-05 09:45:34.990 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:45:36 compute-1 nova_compute[189066]: 2025-12-05 09:45:36.540 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:36 compute-1 nova_compute[189066]: 2025-12-05 09:45:36.593 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:40 compute-1 podman[233291]: 2025-12-05 09:45:40.65869527 +0000 UTC m=+0.100537304 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Dec 05 09:45:41 compute-1 nova_compute[189066]: 2025-12-05 09:45:41.546 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:41 compute-1 nova_compute[189066]: 2025-12-05 09:45:41.595 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:42.052 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:45:42 compute-1 nova_compute[189066]: 2025-12-05 09:45:42.053 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:42 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:42.055 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:45:42 compute-1 podman[233317]: 2025-12-05 09:45:42.616061045 +0000 UTC m=+0.056017883 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent)
Dec 05 09:45:45 compute-1 podman[233337]: 2025-12-05 09:45:45.659621379 +0000 UTC m=+0.082903701 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 05 09:45:46 compute-1 nova_compute[189066]: 2025-12-05 09:45:46.552 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:46 compute-1 nova_compute[189066]: 2025-12-05 09:45:46.598 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:48 compute-1 podman[233357]: 2025-12-05 09:45:48.640804127 +0000 UTC m=+0.077961201 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 09:45:51 compute-1 nova_compute[189066]: 2025-12-05 09:45:51.554 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:51 compute-1 nova_compute[189066]: 2025-12-05 09:45:51.600 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:52.058 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:45:53 compute-1 podman[233379]: 2025-12-05 09:45:53.618129827 +0000 UTC m=+0.055691765 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:45:56 compute-1 nova_compute[189066]: 2025-12-05 09:45:56.555 189070 DEBUG nova.compute.manager [req-c147e3d0-97c6-4fcf-b951-9f0d5c6605eb req-32c6aa1a-a292-4955-9f4d-b7b936dd96cf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-changed-5db43502-374a-4ff3-8bcc-a41bb1ae8440 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:45:56 compute-1 nova_compute[189066]: 2025-12-05 09:45:56.555 189070 DEBUG nova.compute.manager [req-c147e3d0-97c6-4fcf-b951-9f0d5c6605eb req-32c6aa1a-a292-4955-9f4d-b7b936dd96cf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Refreshing instance network info cache due to event network-changed-5db43502-374a-4ff3-8bcc-a41bb1ae8440. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:45:56 compute-1 nova_compute[189066]: 2025-12-05 09:45:56.555 189070 DEBUG oslo_concurrency.lockutils [req-c147e3d0-97c6-4fcf-b951-9f0d5c6605eb req-32c6aa1a-a292-4955-9f4d-b7b936dd96cf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:45:56 compute-1 nova_compute[189066]: 2025-12-05 09:45:56.556 189070 DEBUG oslo_concurrency.lockutils [req-c147e3d0-97c6-4fcf-b951-9f0d5c6605eb req-32c6aa1a-a292-4955-9f4d-b7b936dd96cf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:45:56 compute-1 nova_compute[189066]: 2025-12-05 09:45:56.556 189070 DEBUG nova.network.neutron [req-c147e3d0-97c6-4fcf-b951-9f0d5c6605eb req-32c6aa1a-a292-4955-9f4d-b7b936dd96cf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Refreshing network info cache for port 5db43502-374a-4ff3-8bcc-a41bb1ae8440 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:45:56 compute-1 nova_compute[189066]: 2025-12-05 09:45:56.599 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:56 compute-1 nova_compute[189066]: 2025-12-05 09:45:56.603 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.159 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "b185217d-5f23-40c8-9f11-7cf87a1886c3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.160 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.160 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.160 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.161 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.162 189070 INFO nova.compute.manager [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Terminating instance
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.163 189070 DEBUG nova.compute.manager [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:45:57 compute-1 kernel: tap5db43502-37 (unregistering): left promiscuous mode
Dec 05 09:45:57 compute-1 NetworkManager[55704]: <info>  [1764927957.1909] device (tap5db43502-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:45:57 compute-1 ovn_controller[95809]: 2025-12-05T09:45:57Z|00259|binding|INFO|Releasing lport 5db43502-374a-4ff3-8bcc-a41bb1ae8440 from this chassis (sb_readonly=0)
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.204 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 ovn_controller[95809]: 2025-12-05T09:45:57Z|00260|binding|INFO|Setting lport 5db43502-374a-4ff3-8bcc-a41bb1ae8440 down in Southbound
Dec 05 09:45:57 compute-1 ovn_controller[95809]: 2025-12-05T09:45:57Z|00261|binding|INFO|Removing iface tap5db43502-37 ovn-installed in OVS
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.207 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.222 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000032.scope: Deactivated successfully.
Dec 05 09:45:57 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000032.scope: Consumed 17.239s CPU time.
Dec 05 09:45:57 compute-1 systemd-machined[154815]: Machine qemu-21-instance-00000032 terminated.
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.255 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:d0:61 10.100.0.4 2001:db8::f816:3eff:fe47:d061'], port_security=['fa:16:3e:47:d0:61 10.100.0.4 2001:db8::f816:3eff:fe47:d061'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe47:d061/64', 'neutron:device_id': 'b185217d-5f23-40c8-9f11-7cf87a1886c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae013098-2701-4ae8-8621-a0515ff9b432', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa1cd463d74b49139a088d332d37e611', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dd3745ba-bc1d-4ff0-a3b8-e555336632a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5135dc1-9df2-4212-8110-1f16b401ed19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=5db43502-374a-4ff3-8bcc-a41bb1ae8440) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.259 105272 INFO neutron.agent.ovn.metadata.agent [-] Port 5db43502-374a-4ff3-8bcc-a41bb1ae8440 in datapath ae013098-2701-4ae8-8621-a0515ff9b432 unbound from our chassis
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.261 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae013098-2701-4ae8-8621-a0515ff9b432, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.263 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[701ed602-cdd6-45ef-8964-475c6aa4f35a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.264 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432 namespace which is not needed anymore
Dec 05 09:45:57 compute-1 podman[233403]: 2025-12-05 09:45:57.271593618 +0000 UTC m=+0.056845193 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.392 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.397 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432[233027]: [NOTICE]   (233031) : haproxy version is 2.8.14-c23fe91
Dec 05 09:45:57 compute-1 neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432[233027]: [NOTICE]   (233031) : path to executable is /usr/sbin/haproxy
Dec 05 09:45:57 compute-1 neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432[233027]: [WARNING]  (233031) : Exiting Master process...
Dec 05 09:45:57 compute-1 neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432[233027]: [ALERT]    (233031) : Current worker (233033) exited with code 143 (Terminated)
Dec 05 09:45:57 compute-1 neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432[233027]: [WARNING]  (233031) : All workers exited. Exiting... (0)
Dec 05 09:45:57 compute-1 systemd[1]: libpod-de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8.scope: Deactivated successfully.
Dec 05 09:45:57 compute-1 podman[233451]: 2025-12-05 09:45:57.420141796 +0000 UTC m=+0.057393866 container died de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.445 189070 INFO nova.virt.libvirt.driver [-] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Instance destroyed successfully.
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.446 189070 DEBUG nova.objects.instance [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lazy-loading 'resources' on Instance uuid b185217d-5f23-40c8-9f11-7cf87a1886c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:45:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8-userdata-shm.mount: Deactivated successfully.
Dec 05 09:45:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-c3ea5bfa49793bf54265759ba8bdca0abcad146031c0a36cfdee477dff3e66f5-merged.mount: Deactivated successfully.
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.484 189070 DEBUG nova.virt.libvirt.vif [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-252150820',display_name='tempest-TestGettingAddress-server-252150820',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-252150820',id=50,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLHoHBnhl39mFqNwjOALF3l4BETvPQ2aaZWcNux2LXAx5w8R9/0SIqzLs8h9GojCCKVzXry2j15SmWDirbWSgGInNAL1B7ZEoog4LHyeWtsxbXu6L3ScevWo6qmZ3/Z1CQ==',key_name='tempest-TestGettingAddress-998480094',keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:44:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fa1cd463d74b49139a088d332d37e611',ramdisk_id='',reservation_id='r-ok2ytoz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-8368731',owner_user_name='tempest-TestGettingAddress-8368731-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:44:50Z,user_data=None,user_id='fae1c60e378945ea84b34c4824b835b1',uuid=b185217d-5f23-40c8-9f11-7cf87a1886c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.485 189070 DEBUG nova.network.os_vif_util [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converting VIF {"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.486 189070 DEBUG nova.network.os_vif_util [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:d0:61,bridge_name='br-int',has_traffic_filtering=True,id=5db43502-374a-4ff3-8bcc-a41bb1ae8440,network=Network(ae013098-2701-4ae8-8621-a0515ff9b432),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db43502-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.487 189070 DEBUG os_vif [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:d0:61,bridge_name='br-int',has_traffic_filtering=True,id=5db43502-374a-4ff3-8bcc-a41bb1ae8440,network=Network(ae013098-2701-4ae8-8621-a0515ff9b432),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db43502-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.489 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.489 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5db43502-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:45:57 compute-1 podman[233451]: 2025-12-05 09:45:57.496375483 +0000 UTC m=+0.133627553 container cleanup de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.496 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.498 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:45:57 compute-1 systemd[1]: libpod-conmon-de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8.scope: Deactivated successfully.
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.503 189070 INFO os_vif [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:d0:61,bridge_name='br-int',has_traffic_filtering=True,id=5db43502-374a-4ff3-8bcc-a41bb1ae8440,network=Network(ae013098-2701-4ae8-8621-a0515ff9b432),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5db43502-37')
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.504 189070 INFO nova.virt.libvirt.driver [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Deleting instance files /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3_del
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.505 189070 INFO nova.virt.libvirt.driver [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Deletion of /var/lib/nova/instances/b185217d-5f23-40c8-9f11-7cf87a1886c3_del complete
Dec 05 09:45:57 compute-1 podman[233492]: 2025-12-05 09:45:57.566415198 +0000 UTC m=+0.045778622 container remove de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.571 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[23e6fa03-8b8a-488e-b084-5bb99821a04b]: (4, ('Fri Dec  5 09:45:57 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432 (de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8)\nde53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8\nFri Dec  5 09:45:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432 (de53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8)\nde53b0a6a1de582e8df938d580856413681176263400b69b752ca4ec9ec480d8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.574 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[4bbc720f-e810-4566-ba05-8900482513eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.575 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae013098-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.577 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 kernel: tapae013098-20: left promiscuous mode
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.579 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.583 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a73c1c-c7d0-4814-a7f9-1282289246b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.589 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.598 189070 INFO nova.compute.manager [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Took 0.43 seconds to destroy the instance on the hypervisor.
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.599 189070 DEBUG oslo.service.loopingcall [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.599 189070 DEBUG nova.compute.manager [-] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:45:57 compute-1 nova_compute[189066]: 2025-12-05 09:45:57.599 189070 DEBUG nova.network.neutron [-] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.609 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[5103640e-196e-41f0-84a6-20958c62f796]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.611 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[9aabbbaa-2257-4f16-bf73-7bec28467a1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.631 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9ea641-c014-45ec-a429-5f730b88e880]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516040, 'reachable_time': 35295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233507, 'error': None, 'target': 'ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.636 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ae013098-2701-4ae8-8621-a0515ff9b432 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:45:57 compute-1 systemd[1]: run-netns-ovnmeta\x2dae013098\x2d2701\x2d4ae8\x2d8621\x2da0515ff9b432.mount: Deactivated successfully.
Dec 05 09:45:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:45:57.637 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[928dad8e-fbf1-4bf2-b408-1b9e9f357bb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:45:58 compute-1 nova_compute[189066]: 2025-12-05 09:45:58.941 189070 DEBUG nova.compute.manager [req-0d427896-3acb-4d0b-b5cc-fcec8f08273c req-7d7d6217-c3f2-46ac-94d3-63a4dd7b1b09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-vif-unplugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:45:58 compute-1 nova_compute[189066]: 2025-12-05 09:45:58.942 189070 DEBUG oslo_concurrency.lockutils [req-0d427896-3acb-4d0b-b5cc-fcec8f08273c req-7d7d6217-c3f2-46ac-94d3-63a4dd7b1b09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:45:58 compute-1 nova_compute[189066]: 2025-12-05 09:45:58.943 189070 DEBUG oslo_concurrency.lockutils [req-0d427896-3acb-4d0b-b5cc-fcec8f08273c req-7d7d6217-c3f2-46ac-94d3-63a4dd7b1b09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:45:58 compute-1 nova_compute[189066]: 2025-12-05 09:45:58.943 189070 DEBUG oslo_concurrency.lockutils [req-0d427896-3acb-4d0b-b5cc-fcec8f08273c req-7d7d6217-c3f2-46ac-94d3-63a4dd7b1b09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:45:58 compute-1 nova_compute[189066]: 2025-12-05 09:45:58.944 189070 DEBUG nova.compute.manager [req-0d427896-3acb-4d0b-b5cc-fcec8f08273c req-7d7d6217-c3f2-46ac-94d3-63a4dd7b1b09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] No waiting events found dispatching network-vif-unplugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:45:58 compute-1 nova_compute[189066]: 2025-12-05 09:45:58.944 189070 DEBUG nova.compute.manager [req-0d427896-3acb-4d0b-b5cc-fcec8f08273c req-7d7d6217-c3f2-46ac-94d3-63a4dd7b1b09 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-vif-unplugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 05 09:46:00 compute-1 nova_compute[189066]: 2025-12-05 09:46:00.185 189070 DEBUG nova.network.neutron [req-c147e3d0-97c6-4fcf-b951-9f0d5c6605eb req-32c6aa1a-a292-4955-9f4d-b7b936dd96cf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updated VIF entry in instance network info cache for port 5db43502-374a-4ff3-8bcc-a41bb1ae8440. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:46:00 compute-1 nova_compute[189066]: 2025-12-05 09:46:00.186 189070 DEBUG nova.network.neutron [req-c147e3d0-97c6-4fcf-b951-9f0d5c6605eb req-32c6aa1a-a292-4955-9f4d-b7b936dd96cf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updating instance_info_cache with network_info: [{"id": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "address": "fa:16:3e:47:d0:61", "network": {"id": "ae013098-2701-4ae8-8621-a0515ff9b432", "bridge": "br-int", "label": "tempest-network-smoke--758374883", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe47:d061", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa1cd463d74b49139a088d332d37e611", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5db43502-37", "ovs_interfaceid": "5db43502-374a-4ff3-8bcc-a41bb1ae8440", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:46:00 compute-1 nova_compute[189066]: 2025-12-05 09:46:00.197 189070 DEBUG nova.network.neutron [-] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:46:00 compute-1 nova_compute[189066]: 2025-12-05 09:46:00.379 189070 INFO nova.compute.manager [-] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Took 2.78 seconds to deallocate network for instance.
Dec 05 09:46:00 compute-1 nova_compute[189066]: 2025-12-05 09:46:00.735 189070 DEBUG oslo_concurrency.lockutils [req-c147e3d0-97c6-4fcf-b951-9f0d5c6605eb req-32c6aa1a-a292-4955-9f4d-b7b936dd96cf 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-b185217d-5f23-40c8-9f11-7cf87a1886c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:46:00 compute-1 nova_compute[189066]: 2025-12-05 09:46:00.879 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:46:00 compute-1 nova_compute[189066]: 2025-12-05 09:46:00.879 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:46:00 compute-1 nova_compute[189066]: 2025-12-05 09:46:00.948 189070 DEBUG nova.compute.provider_tree [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.070 189070 DEBUG nova.scheduler.client.report [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.145 189070 DEBUG nova.compute.manager [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.146 189070 DEBUG oslo_concurrency.lockutils [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.146 189070 DEBUG oslo_concurrency.lockutils [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.147 189070 DEBUG oslo_concurrency.lockutils [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.147 189070 DEBUG nova.compute.manager [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] No waiting events found dispatching network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.148 189070 WARNING nova.compute.manager [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received unexpected event network-vif-plugged-5db43502-374a-4ff3-8bcc-a41bb1ae8440 for instance with vm_state deleted and task_state None.
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.148 189070 DEBUG nova.compute.manager [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Received event network-vif-deleted-5db43502-374a-4ff3-8bcc-a41bb1ae8440 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.149 189070 INFO nova.compute.manager [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Neutron deleted interface 5db43502-374a-4ff3-8bcc-a41bb1ae8440; detaching it from the instance and deleting it from the info cache
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.149 189070 DEBUG nova.network.neutron [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.195 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.219 189070 DEBUG nova.compute.manager [req-cd7df7b9-15a0-4362-ae23-23aacef33814 req-08871a5c-6693-4cfe-a42a-c2fa378bfa20 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Detach interface failed, port_id=5db43502-374a-4ff3-8bcc-a41bb1ae8440, reason: Instance b185217d-5f23-40c8-9f11-7cf87a1886c3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.228 189070 INFO nova.scheduler.client.report [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Deleted allocations for instance b185217d-5f23-40c8-9f11-7cf87a1886c3
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.447 189070 DEBUG oslo_concurrency.lockutils [None req-844502e2-9f9a-4ce7-80cd-25140ea4fb0e fae1c60e378945ea84b34c4824b835b1 fa1cd463d74b49139a088d332d37e611 - - default default] Lock "b185217d-5f23-40c8-9f11-7cf87a1886c3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:46:01 compute-1 nova_compute[189066]: 2025-12-05 09:46:01.603 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:02 compute-1 nova_compute[189066]: 2025-12-05 09:46:02.493 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:03 compute-1 podman[233508]: 2025-12-05 09:46:03.643476432 +0000 UTC m=+0.076257938 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:46:06 compute-1 nova_compute[189066]: 2025-12-05 09:46:06.606 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:07 compute-1 nova_compute[189066]: 2025-12-05 09:46:07.497 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:46:08.895 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:46:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:46:08.896 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:46:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:46:08.896 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:46:10 compute-1 nova_compute[189066]: 2025-12-05 09:46:10.845 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:11 compute-1 nova_compute[189066]: 2025-12-05 09:46:11.027 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:11 compute-1 nova_compute[189066]: 2025-12-05 09:46:11.607 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:11 compute-1 podman[233530]: 2025-12-05 09:46:11.683768456 +0000 UTC m=+0.107132666 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 05 09:46:12 compute-1 nova_compute[189066]: 2025-12-05 09:46:12.444 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764927957.4430914, b185217d-5f23-40c8-9f11-7cf87a1886c3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:46:12 compute-1 nova_compute[189066]: 2025-12-05 09:46:12.445 189070 INFO nova.compute.manager [-] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] VM Stopped (Lifecycle Event)
Dec 05 09:46:12 compute-1 nova_compute[189066]: 2025-12-05 09:46:12.498 189070 DEBUG nova.compute.manager [None req-9f074518-bf52-41df-aa2d-b5b0218dbb31 - - - - - -] [instance: b185217d-5f23-40c8-9f11-7cf87a1886c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:46:12 compute-1 nova_compute[189066]: 2025-12-05 09:46:12.499 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:13 compute-1 podman[233556]: 2025-12-05 09:46:13.623264592 +0000 UTC m=+0.060397080 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:46:16 compute-1 nova_compute[189066]: 2025-12-05 09:46:16.608 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:16 compute-1 podman[233575]: 2025-12-05 09:46:16.646849448 +0000 UTC m=+0.087213347 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 09:46:17 compute-1 nova_compute[189066]: 2025-12-05 09:46:17.502 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:19 compute-1 podman[233595]: 2025-12-05 09:46:19.631075391 +0000 UTC m=+0.060175065 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Dec 05 09:46:21 compute-1 nova_compute[189066]: 2025-12-05 09:46:21.610 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:22 compute-1 nova_compute[189066]: 2025-12-05 09:46:22.504 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:24 compute-1 podman[233616]: 2025-12-05 09:46:24.621313879 +0000 UTC m=+0.064666265 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 09:46:25 compute-1 nova_compute[189066]: 2025-12-05 09:46:25.335 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:25 compute-1 nova_compute[189066]: 2025-12-05 09:46:25.335 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:26 compute-1 nova_compute[189066]: 2025-12-05 09:46:26.033 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:26 compute-1 nova_compute[189066]: 2025-12-05 09:46:26.033 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:26 compute-1 nova_compute[189066]: 2025-12-05 09:46:26.611 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:27 compute-1 nova_compute[189066]: 2025-12-05 09:46:27.507 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:27 compute-1 podman[233640]: 2025-12-05 09:46:27.63406936 +0000 UTC m=+0.071260737 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:46:28 compute-1 nova_compute[189066]: 2025-12-05 09:46:28.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:28 compute-1 nova_compute[189066]: 2025-12-05 09:46:28.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:46:28 compute-1 nova_compute[189066]: 2025-12-05 09:46:28.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:46:28 compute-1 nova_compute[189066]: 2025-12-05 09:46:28.040 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:46:30 compute-1 nova_compute[189066]: 2025-12-05 09:46:30.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:30 compute-1 nova_compute[189066]: 2025-12-05 09:46:30.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:46:31 compute-1 nova_compute[189066]: 2025-12-05 09:46:31.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:31 compute-1 nova_compute[189066]: 2025-12-05 09:46:31.613 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.071 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.072 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.072 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.073 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.278 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.279 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5762MB free_disk=73.32379150390625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.279 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.280 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.399 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.399 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.428 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.465 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.510 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.513 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:46:32 compute-1 nova_compute[189066]: 2025-12-05 09:46:32.514 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:46:33 compute-1 nova_compute[189066]: 2025-12-05 09:46:33.515 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:46:34 compute-1 podman[233665]: 2025-12-05 09:46:34.66974424 +0000 UTC m=+0.092749193 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 09:46:36 compute-1 nova_compute[189066]: 2025-12-05 09:46:36.615 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:37 compute-1 nova_compute[189066]: 2025-12-05 09:46:37.514 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:41 compute-1 nova_compute[189066]: 2025-12-05 09:46:41.617 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:42 compute-1 nova_compute[189066]: 2025-12-05 09:46:42.518 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:42 compute-1 podman[233685]: 2025-12-05 09:46:42.670207538 +0000 UTC m=+0.102979505 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 09:46:43 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:46:43.592 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:46:43 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:46:43.593 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:46:43 compute-1 nova_compute[189066]: 2025-12-05 09:46:43.642 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:44 compute-1 podman[233713]: 2025-12-05 09:46:44.624719323 +0000 UTC m=+0.058555816 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:46:46 compute-1 nova_compute[189066]: 2025-12-05 09:46:46.619 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:47 compute-1 nova_compute[189066]: 2025-12-05 09:46:47.522 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:47 compute-1 podman[233732]: 2025-12-05 09:46:47.623396205 +0000 UTC m=+0.066060030 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:46:50 compute-1 podman[233753]: 2025-12-05 09:46:50.629386568 +0000 UTC m=+0.068864648 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc.)
Dec 05 09:46:51 compute-1 nova_compute[189066]: 2025-12-05 09:46:51.621 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:52 compute-1 nova_compute[189066]: 2025-12-05 09:46:52.525 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:52 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:46:52.595 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:46:55 compute-1 podman[233775]: 2025-12-05 09:46:55.67610473 +0000 UTC m=+0.105428505 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:46:56 compute-1 nova_compute[189066]: 2025-12-05 09:46:56.624 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:57 compute-1 nova_compute[189066]: 2025-12-05 09:46:57.562 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:46:58 compute-1 podman[233799]: 2025-12-05 09:46:58.628389006 +0000 UTC m=+0.059932170 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 05 09:47:01 compute-1 nova_compute[189066]: 2025-12-05 09:47:01.625 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:01 compute-1 nova_compute[189066]: 2025-12-05 09:47:01.757 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "4018e31a-e4b1-4786-8a23-a6824cfe324b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:01 compute-1 nova_compute[189066]: 2025-12-05 09:47:01.757 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:01 compute-1 nova_compute[189066]: 2025-12-05 09:47:01.806 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 05 09:47:01 compute-1 nova_compute[189066]: 2025-12-05 09:47:01.932 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:01 compute-1 nova_compute[189066]: 2025-12-05 09:47:01.932 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:01 compute-1 nova_compute[189066]: 2025-12-05 09:47:01.941 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 05 09:47:01 compute-1 nova_compute[189066]: 2025-12-05 09:47:01.941 189070 INFO nova.compute.claims [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Claim successful on node compute-1.ctlplane.example.com
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.130 189070 DEBUG nova.compute.provider_tree [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.149 189070 DEBUG nova.scheduler.client.report [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.198 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.199 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.267 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.268 189070 DEBUG nova.network.neutron [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.308 189070 INFO nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.332 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.499 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.500 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.501 189070 INFO nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Creating image(s)
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.502 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "/var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.502 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "/var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.503 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "/var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.520 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.565 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.584 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.586 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.586 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.600 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.658 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.660 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.697 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce,backing_fmt=raw /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.699 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.699 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.760 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5b1bdb5cc50b9bf43ebe3094e9279a12a18df0ce --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.761 189070 DEBUG nova.virt.disk.api [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Checking if we can resize image /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.762 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.818 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.819 189070 DEBUG nova.virt.disk.api [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Cannot resize image /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.819 189070 DEBUG nova.objects.instance [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lazy-loading 'migration_context' on Instance uuid 4018e31a-e4b1-4786-8a23-a6824cfe324b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.849 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.850 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Ensure instance console log exists: /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.851 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.851 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:02 compute-1 nova_compute[189066]: 2025-12-05 09:47:02.851 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:04 compute-1 nova_compute[189066]: 2025-12-05 09:47:04.287 189070 DEBUG nova.network.neutron [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Successfully created port: b627bc29-57d3-4693-9334-e4986065502a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 05 09:47:05 compute-1 podman[233841]: 2025-12-05 09:47:05.684819596 +0000 UTC m=+0.121074038 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:47:06 compute-1 sshd[130045]: Timeout before authentication for connection from 221.237.163.202 to 38.102.83.154, pid = 233122
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.350 189070 DEBUG nova.network.neutron [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Successfully updated port: b627bc29-57d3-4693-9334-e4986065502a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.397 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "refresh_cache-4018e31a-e4b1-4786-8a23-a6824cfe324b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.398 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquired lock "refresh_cache-4018e31a-e4b1-4786-8a23-a6824cfe324b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.398 189070 DEBUG nova.network.neutron [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.545 189070 DEBUG nova.compute.manager [req-0b300092-5611-4c9a-9494-47dfb399e07e req-d6a37062-7d0e-4576-8df1-37331a1e868e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received event network-changed-b627bc29-57d3-4693-9334-e4986065502a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.546 189070 DEBUG nova.compute.manager [req-0b300092-5611-4c9a-9494-47dfb399e07e req-d6a37062-7d0e-4576-8df1-37331a1e868e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Refreshing instance network info cache due to event network-changed-b627bc29-57d3-4693-9334-e4986065502a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.546 189070 DEBUG oslo_concurrency.lockutils [req-0b300092-5611-4c9a-9494-47dfb399e07e req-d6a37062-7d0e-4576-8df1-37331a1e868e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "refresh_cache-4018e31a-e4b1-4786-8a23-a6824cfe324b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.628 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:06 compute-1 nova_compute[189066]: 2025-12-05 09:47:06.778 189070 DEBUG nova.network.neutron [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 05 09:47:07 compute-1 nova_compute[189066]: 2025-12-05 09:47:07.607 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:08 compute-1 nova_compute[189066]: 2025-12-05 09:47:08.633 189070 DEBUG nova.network.neutron [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Updating instance_info_cache with network_info: [{"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:47:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:08.896 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:08.897 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:08.897 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.031 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Releasing lock "refresh_cache-4018e31a-e4b1-4786-8a23-a6824cfe324b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.031 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Instance network_info: |[{"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.032 189070 DEBUG oslo_concurrency.lockutils [req-0b300092-5611-4c9a-9494-47dfb399e07e req-d6a37062-7d0e-4576-8df1-37331a1e868e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquired lock "refresh_cache-4018e31a-e4b1-4786-8a23-a6824cfe324b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.033 189070 DEBUG nova.network.neutron [req-0b300092-5611-4c9a-9494-47dfb399e07e req-d6a37062-7d0e-4576-8df1-37331a1e868e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Refreshing network info cache for port b627bc29-57d3-4693-9334-e4986065502a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.037 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Start _get_guest_xml network_info=[{"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'image_id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.044 189070 WARNING nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.052 189070 DEBUG nova.virt.libvirt.host [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.052 189070 DEBUG nova.virt.libvirt.host [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.057 189070 DEBUG nova.virt.libvirt.host [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.058 189070 DEBUG nova.virt.libvirt.host [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.063 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.064 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T09:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fbadeab4-f24f-4100-963a-d228b2a6f7c4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T09:19:04Z,direct_url=<?>,disk_format='qcow2',id=3ebffd97-b242-42d7-b245-ebdaf8e4377c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1cb29f3743454b7fb71af92993455437',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T09:19:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.066 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.067 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.067 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.068 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.068 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.069 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.070 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.071 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.071 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.072 189070 DEBUG nova.virt.hardware [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.085 189070 DEBUG nova.virt.libvirt.vif [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:47:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-738303947',display_name='tempest-TestServerMultinode-server-738303947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-738303947',id=55,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0279a3edc4d145a3815f122291b494bf',ramdisk_id='',reservation_id='r-im3vetyr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1600634262',owner_user_name='tempest-TestServerMultinode-1600634262-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:47:02Z,user_data=None,user_id='d96d7aa420944947ad70e496aec2bea1',uuid=4018e31a-e4b1-4786-8a23-a6824cfe324b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.086 189070 DEBUG nova.network.os_vif_util [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Converting VIF {"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.089 189070 DEBUG nova.network.os_vif_util [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:43:ca,bridge_name='br-int',has_traffic_filtering=True,id=b627bc29-57d3-4693-9334-e4986065502a,network=Network(71ae38f4-8c55-4792-9422-fcbcaa5f09ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb627bc29-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.090 189070 DEBUG nova.objects.instance [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 4018e31a-e4b1-4786-8a23-a6824cfe324b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.318 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] End _get_guest_xml xml=<domain type="kvm">
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <uuid>4018e31a-e4b1-4786-8a23-a6824cfe324b</uuid>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <name>instance-00000037</name>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <memory>131072</memory>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <vcpu>1</vcpu>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <metadata>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <nova:name>tempest-TestServerMultinode-server-738303947</nova:name>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <nova:creationTime>2025-12-05 09:47:09</nova:creationTime>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <nova:flavor name="m1.nano">
Dec 05 09:47:09 compute-1 nova_compute[189066]:         <nova:memory>128</nova:memory>
Dec 05 09:47:09 compute-1 nova_compute[189066]:         <nova:disk>1</nova:disk>
Dec 05 09:47:09 compute-1 nova_compute[189066]:         <nova:swap>0</nova:swap>
Dec 05 09:47:09 compute-1 nova_compute[189066]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 09:47:09 compute-1 nova_compute[189066]:         <nova:vcpus>1</nova:vcpus>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       </nova:flavor>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <nova:owner>
Dec 05 09:47:09 compute-1 nova_compute[189066]:         <nova:user uuid="d96d7aa420944947ad70e496aec2bea1">tempest-TestServerMultinode-1600634262-project-admin</nova:user>
Dec 05 09:47:09 compute-1 nova_compute[189066]:         <nova:project uuid="0279a3edc4d145a3815f122291b494bf">tempest-TestServerMultinode-1600634262</nova:project>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       </nova:owner>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <nova:root type="image" uuid="3ebffd97-b242-42d7-b245-ebdaf8e4377c"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <nova:ports>
Dec 05 09:47:09 compute-1 nova_compute[189066]:         <nova:port uuid="b627bc29-57d3-4693-9334-e4986065502a">
Dec 05 09:47:09 compute-1 nova_compute[189066]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:         </nova:port>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       </nova:ports>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </nova:instance>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   </metadata>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <sysinfo type="smbios">
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <system>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <entry name="manufacturer">RDO</entry>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <entry name="product">OpenStack Compute</entry>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <entry name="serial">4018e31a-e4b1-4786-8a23-a6824cfe324b</entry>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <entry name="uuid">4018e31a-e4b1-4786-8a23-a6824cfe324b</entry>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <entry name="family">Virtual Machine</entry>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </system>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   </sysinfo>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <os>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <boot dev="hd"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <smbios mode="sysinfo"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   </os>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <features>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <acpi/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <apic/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <vmcoreinfo/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   </features>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <clock offset="utc">
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <timer name="hpet" present="no"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   </clock>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <cpu mode="custom" match="exact">
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <model>Nehalem</model>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   </cpu>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   <devices>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <disk type="file" device="disk">
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <target dev="vda" bus="virtio"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <disk type="file" device="cdrom">
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <source file="/var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk.config"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <target dev="sda" bus="sata"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </disk>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <interface type="ethernet">
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <mac address="fa:16:3e:43:43:ca"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <mtu size="1442"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <target dev="tapb627bc29-57"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </interface>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <serial type="pty">
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <log file="/var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/console.log" append="off"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </serial>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <video>
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <model type="virtio"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </video>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <input type="tablet" bus="usb"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <rng model="virtio">
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <backend model="random">/dev/urandom</backend>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </rng>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <controller type="usb" index="0"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     <memballoon model="virtio">
Dec 05 09:47:09 compute-1 nova_compute[189066]:       <stats period="10"/>
Dec 05 09:47:09 compute-1 nova_compute[189066]:     </memballoon>
Dec 05 09:47:09 compute-1 nova_compute[189066]:   </devices>
Dec 05 09:47:09 compute-1 nova_compute[189066]: </domain>
Dec 05 09:47:09 compute-1 nova_compute[189066]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.319 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Preparing to wait for external event network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.320 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.320 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.320 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.321 189070 DEBUG nova.virt.libvirt.vif [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T09:47:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-738303947',display_name='tempest-TestServerMultinode-server-738303947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-738303947',id=55,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0279a3edc4d145a3815f122291b494bf',ramdisk_id='',reservation_id='r-im3vetyr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1600634262',owner_user_name='tempest-TestServerMultinode-1600634262-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T09:47:02Z,user_data=None,user_id='d96d7aa420944947ad70e496aec2bea1',uuid=4018e31a-e4b1-4786-8a23-a6824cfe324b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.321 189070 DEBUG nova.network.os_vif_util [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Converting VIF {"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.322 189070 DEBUG nova.network.os_vif_util [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:43:ca,bridge_name='br-int',has_traffic_filtering=True,id=b627bc29-57d3-4693-9334-e4986065502a,network=Network(71ae38f4-8c55-4792-9422-fcbcaa5f09ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb627bc29-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.322 189070 DEBUG os_vif [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:43:ca,bridge_name='br-int',has_traffic_filtering=True,id=b627bc29-57d3-4693-9334-e4986065502a,network=Network(71ae38f4-8c55-4792-9422-fcbcaa5f09ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb627bc29-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.323 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.324 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.324 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.335 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.336 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb627bc29-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.337 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb627bc29-57, col_values=(('external_ids', {'iface-id': 'b627bc29-57d3-4693-9334-e4986065502a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:43:ca', 'vm-uuid': '4018e31a-e4b1-4786-8a23-a6824cfe324b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.339 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.342 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:47:09 compute-1 NetworkManager[55704]: <info>  [1764928029.3442] manager: (tapb627bc29-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.352 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:09 compute-1 nova_compute[189066]: 2025-12-05 09:47:09.355 189070 INFO os_vif [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:43:ca,bridge_name='br-int',has_traffic_filtering=True,id=b627bc29-57d3-4693-9334-e4986065502a,network=Network(71ae38f4-8c55-4792-9422-fcbcaa5f09ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb627bc29-57')
Dec 05 09:47:10 compute-1 nova_compute[189066]: 2025-12-05 09:47:10.257 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:47:10 compute-1 nova_compute[189066]: 2025-12-05 09:47:10.258 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 05 09:47:10 compute-1 nova_compute[189066]: 2025-12-05 09:47:10.258 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] No VIF found with MAC fa:16:3e:43:43:ca, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 05 09:47:10 compute-1 nova_compute[189066]: 2025-12-05 09:47:10.260 189070 INFO nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Using config drive
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.760 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4018e31a-e4b1-4786-8a23-a6824cfe324b', 'name': 'tempest-TestServerMultinode-server-738303947', 'flavor': {'id': 'fbadeab4-f24f-4100-963a-d228b2a6f7c4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3ebffd97-b242-42d7-b245-ebdaf8e4377c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000037', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '0279a3edc4d145a3815f122291b494bf', 'user_id': 'd96d7aa420944947ad70e496aec2bea1', 'hostId': '510be072085e9053d270a6a47e235526b840bc171d132cbe60e99985', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.764 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.764 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.765 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.765 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.765 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerMultinode-server-738303947>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerMultinode-server-738303947>]
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.767 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.768 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.768 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.768 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerMultinode-server-738303947>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerMultinode-server-738303947>]
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.768 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.768 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerMultinode-server-738303947>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerMultinode-server-738303947>]
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.769 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.770 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.771 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.772 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.772 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.773 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.774 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.775 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.776 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.777 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.777 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.777 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.777 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerMultinode-server-738303947>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerMultinode-server-738303947>]
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.778 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.778 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.779 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.780 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.780 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.781 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.782 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.783 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.783 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.784 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.784 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.785 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 05 09:47:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:47:10.786 12 DEBUG ceilometer.compute.pollsters [-] Instance 4018e31a-e4b1-4786-8a23-a6824cfe324b was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000037, id=4018e31a-e4b1-4786-8a23-a6824cfe324b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.589 189070 INFO nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Creating config drive at /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk.config
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.596 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp471a6lhi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.630 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.751 189070 DEBUG oslo_concurrency.processutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp471a6lhi" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 05 09:47:11 compute-1 kernel: tapb627bc29-57: entered promiscuous mode
Dec 05 09:47:11 compute-1 NetworkManager[55704]: <info>  [1764928031.8235] manager: (tapb627bc29-57): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.823 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.826 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:11 compute-1 ovn_controller[95809]: 2025-12-05T09:47:11Z|00262|binding|INFO|Claiming lport b627bc29-57d3-4693-9334-e4986065502a for this chassis.
Dec 05 09:47:11 compute-1 ovn_controller[95809]: 2025-12-05T09:47:11Z|00263|binding|INFO|b627bc29-57d3-4693-9334-e4986065502a: Claiming fa:16:3e:43:43:ca 10.100.0.7
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.831 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.839 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:43:ca 10.100.0.7'], port_security=['fa:16:3e:43:43:ca 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4018e31a-e4b1-4786-8a23-a6824cfe324b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71ae38f4-8c55-4792-9422-fcbcaa5f09ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0279a3edc4d145a3815f122291b494bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fd674063-42aa-4a30-b09e-f4f4a4cb3d69', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51cdcecf-b9b8-4fb1-8306-272b186b0af7, chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=b627bc29-57d3-4693-9334-e4986065502a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.840 105272 INFO neutron.agent.ovn.metadata.agent [-] Port b627bc29-57d3-4693-9334-e4986065502a in datapath 71ae38f4-8c55-4792-9422-fcbcaa5f09ab bound to our chassis
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.842 105272 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 71ae38f4-8c55-4792-9422-fcbcaa5f09ab
Dec 05 09:47:11 compute-1 systemd-udevd[233883]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.859 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[2031f8ea-aafc-4db6-94f2-2941d2108ccc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.861 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap71ae38f4-81 in ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 05 09:47:11 compute-1 systemd-machined[154815]: New machine qemu-22-instance-00000037.
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.866 220556 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap71ae38f4-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.866 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ca70afba-36b2-4c83-90d9-3e28b656e41a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.867 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b7fa004d-bfce-4f63-bf17-620d44305838]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:11 compute-1 NetworkManager[55704]: <info>  [1764928031.8791] device (tapb627bc29-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 09:47:11 compute-1 NetworkManager[55704]: <info>  [1764928031.8807] device (tapb627bc29-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.881 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:11 compute-1 ovn_controller[95809]: 2025-12-05T09:47:11Z|00264|binding|INFO|Setting lport b627bc29-57d3-4693-9334-e4986065502a ovn-installed in OVS
Dec 05 09:47:11 compute-1 ovn_controller[95809]: 2025-12-05T09:47:11Z|00265|binding|INFO|Setting lport b627bc29-57d3-4693-9334-e4986065502a up in Southbound
Dec 05 09:47:11 compute-1 systemd[1]: Started Virtual Machine qemu-22-instance-00000037.
Dec 05 09:47:11 compute-1 nova_compute[189066]: 2025-12-05 09:47:11.886 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.885 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[0a05521d-46a7-4219-8edb-53b556b64fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.926 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[45f9d9e1-ecc7-46ff-beec-fa8f8a6395ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.973 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[4bef7c82-dcfb-4c0a-b557-afbf5fc33059]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:11 compute-1 systemd-udevd[233886]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 09:47:11 compute-1 NetworkManager[55704]: <info>  [1764928031.9824] manager: (tap71ae38f4-80): new Veth device (/org/freedesktop/NetworkManager/Devices/139)
Dec 05 09:47:11 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:11.981 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[42d330be-525b-4602-b745-f1908fb873fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.019 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[8e00d2ea-43ec-4805-854d-5e19873a94ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.024 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0ee774-eab1-4b61-8c8e-20ecd8b2b14f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 NetworkManager[55704]: <info>  [1764928032.0529] device (tap71ae38f4-80): carrier: link connected
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.060 220896 DEBUG oslo.privsep.daemon [-] privsep: reply[3afdfc0f-94a0-4417-ba75-c630ed7b2cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.086 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[93b8440a-5f5a-41c7-ae8e-6aa6bacca8e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap71ae38f4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:62:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530505, 'reachable_time': 40460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233915, 'error': None, 'target': 'ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.112 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[12ed4930-7306-4275-8d05-ffa308a0c16f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:6203'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530505, 'tstamp': 530505}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233916, 'error': None, 'target': 'ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.133 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6183380e-3817-463b-ae68-2831533d1dd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap71ae38f4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:62:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530505, 'reachable_time': 40460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233917, 'error': None, 'target': 'ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.188 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[74a1ae7b-b2a7-4567-8cae-21251e87673e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.272 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[34dc014e-ef33-4bfa-af3f-666ebdbb6cad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.275 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71ae38f4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.275 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.276 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71ae38f4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:47:12 compute-1 NetworkManager[55704]: <info>  [1764928032.2802] manager: (tap71ae38f4-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Dec 05 09:47:12 compute-1 kernel: tap71ae38f4-80: entered promiscuous mode
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.280 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.284 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap71ae38f4-80, col_values=(('external_ids', {'iface-id': '31bf87d4-df56-40df-b27e-398b61c779fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.286 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:12 compute-1 ovn_controller[95809]: 2025-12-05T09:47:12Z|00266|binding|INFO|Releasing lport 31bf87d4-df56-40df-b27e-398b61c779fb from this chassis (sb_readonly=0)
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.300 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.302 105272 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/71ae38f4-8c55-4792-9422-fcbcaa5f09ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/71ae38f4-8c55-4792-9422-fcbcaa5f09ab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.304 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[b29ae711-daa1-4ce1-8fa4-f2ee0d590d95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.305 105272 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: global
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     log         /dev/log local0 debug
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     log-tag     haproxy-metadata-proxy-71ae38f4-8c55-4792-9422-fcbcaa5f09ab
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     user        root
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     group       root
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     maxconn     1024
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     pidfile     /var/lib/neutron/external/pids/71ae38f4-8c55-4792-9422-fcbcaa5f09ab.pid.haproxy
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     daemon
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: defaults
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     log global
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     mode http
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     option httplog
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     option dontlognull
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     option http-server-close
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     option forwardfor
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     retries                 3
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     timeout http-request    30s
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     timeout connect         30s
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     timeout client          32s
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     timeout server          32s
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     timeout http-keep-alive 30s
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: listen listener
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     bind 169.254.169.254:80
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:     http-request add-header X-OVN-Network-ID 71ae38f4-8c55-4792-9422-fcbcaa5f09ab
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 05 09:47:12 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:12.306 105272 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab', 'env', 'PROCESS_TAG=haproxy-71ae38f4-8c55-4792-9422-fcbcaa5f09ab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/71ae38f4-8c55-4792-9422-fcbcaa5f09ab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.423 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764928032.4229114, 4018e31a-e4b1-4786-8a23-a6824cfe324b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.424 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] VM Started (Lifecycle Event)
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.607 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.615 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764928032.4240413, 4018e31a-e4b1-4786-8a23-a6824cfe324b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.616 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] VM Paused (Lifecycle Event)
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.637 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.641 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:47:12 compute-1 nova_compute[189066]: 2025-12-05 09:47:12.676 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:47:12 compute-1 podman[233956]: 2025-12-05 09:47:12.765319275 +0000 UTC m=+0.056247199 container create 5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 05 09:47:12 compute-1 systemd[1]: Started libpod-conmon-5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604.scope.
Dec 05 09:47:12 compute-1 podman[233956]: 2025-12-05 09:47:12.735482764 +0000 UTC m=+0.026410718 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 05 09:47:12 compute-1 systemd[1]: Started libcrun container.
Dec 05 09:47:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0d01513a596861ad521e323c13fd25c9e39be09a56039fb0f7bdfc736f32bf7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 09:47:12 compute-1 podman[233956]: 2025-12-05 09:47:12.872379709 +0000 UTC m=+0.163307663 container init 5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 05 09:47:12 compute-1 podman[233956]: 2025-12-05 09:47:12.879115203 +0000 UTC m=+0.170043137 container start 5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:47:12 compute-1 neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab[233972]: [NOTICE]   (233994) : New worker (234000) forked
Dec 05 09:47:12 compute-1 neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab[233972]: [NOTICE]   (233994) : Loading success.
Dec 05 09:47:12 compute-1 podman[233969]: 2025-12-05 09:47:12.929248432 +0000 UTC m=+0.109993656 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.613 189070 DEBUG nova.compute.manager [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received event network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.614 189070 DEBUG oslo_concurrency.lockutils [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.615 189070 DEBUG oslo_concurrency.lockutils [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.615 189070 DEBUG oslo_concurrency.lockutils [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.615 189070 DEBUG nova.compute.manager [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Processing event network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.616 189070 DEBUG nova.compute.manager [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received event network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.616 189070 DEBUG oslo_concurrency.lockutils [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.616 189070 DEBUG oslo_concurrency.lockutils [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.616 189070 DEBUG oslo_concurrency.lockutils [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.617 189070 DEBUG nova.compute.manager [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] No waiting events found dispatching network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.617 189070 WARNING nova.compute.manager [req-08b8a792-670d-45a6-ab90-8a2a8bbe1a9f req-6e2b3445-8568-4de5-a76e-bf67818593ac 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received unexpected event network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a for instance with vm_state building and task_state spawning.
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.618 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.623 189070 DEBUG nova.virt.driver [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] Emitting event <LifecycleEvent: 1764928033.6229758, 4018e31a-e4b1-4786-8a23-a6824cfe324b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.623 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] VM Resumed (Lifecycle Event)
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.627 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.633 189070 INFO nova.virt.libvirt.driver [-] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Instance spawned successfully.
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.634 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.662 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.671 189070 DEBUG nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.676 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.676 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.677 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.678 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.678 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.679 189070 DEBUG nova.virt.libvirt.driver [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.721 189070 INFO nova.compute.manager [None req-f4fc3854-b4f3-4a3a-acce-8fbd1e6777ab - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.766 189070 INFO nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Took 11.27 seconds to spawn the instance on the hypervisor.
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.767 189070 DEBUG nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.865 189070 INFO nova.compute.manager [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Took 11.99 seconds to build instance.
Dec 05 09:47:13 compute-1 nova_compute[189066]: 2025-12-05 09:47:13.885 189070 DEBUG oslo_concurrency.lockutils [None req-84516989-4f0e-418d-84d5-6ea48e6d87be d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:14 compute-1 nova_compute[189066]: 2025-12-05 09:47:14.064 189070 DEBUG nova.network.neutron [req-0b300092-5611-4c9a-9494-47dfb399e07e req-d6a37062-7d0e-4576-8df1-37331a1e868e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Updated VIF entry in instance network info cache for port b627bc29-57d3-4693-9334-e4986065502a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 05 09:47:14 compute-1 nova_compute[189066]: 2025-12-05 09:47:14.065 189070 DEBUG nova.network.neutron [req-0b300092-5611-4c9a-9494-47dfb399e07e req-d6a37062-7d0e-4576-8df1-37331a1e868e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Updating instance_info_cache with network_info: [{"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:47:14 compute-1 nova_compute[189066]: 2025-12-05 09:47:14.089 189070 DEBUG oslo_concurrency.lockutils [req-0b300092-5611-4c9a-9494-47dfb399e07e req-d6a37062-7d0e-4576-8df1-37331a1e868e 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Releasing lock "refresh_cache-4018e31a-e4b1-4786-8a23-a6824cfe324b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 05 09:47:14 compute-1 nova_compute[189066]: 2025-12-05 09:47:14.341 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:15 compute-1 podman[234013]: 2025-12-05 09:47:15.633404128 +0000 UTC m=+0.070052648 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 05 09:47:16 compute-1 nova_compute[189066]: 2025-12-05 09:47:16.633 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:18 compute-1 podman[234033]: 2025-12-05 09:47:18.639651017 +0000 UTC m=+0.070960760 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:47:19 compute-1 nova_compute[189066]: 2025-12-05 09:47:19.346 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:21 compute-1 podman[234054]: 2025-12-05 09:47:21.627971636 +0000 UTC m=+0.065835984 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible)
Dec 05 09:47:21 compute-1 nova_compute[189066]: 2025-12-05 09:47:21.637 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:24 compute-1 nova_compute[189066]: 2025-12-05 09:47:24.349 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.742 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "4018e31a-e4b1-4786-8a23-a6824cfe324b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.743 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.743 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.743 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.744 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.745 189070 INFO nova.compute.manager [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Terminating instance
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.746 189070 DEBUG nova.compute.manager [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 05 09:47:25 compute-1 kernel: tapb627bc29-57 (unregistering): left promiscuous mode
Dec 05 09:47:25 compute-1 NetworkManager[55704]: <info>  [1764928045.7749] device (tapb627bc29-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.793 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:25 compute-1 ovn_controller[95809]: 2025-12-05T09:47:25Z|00267|binding|INFO|Releasing lport b627bc29-57d3-4693-9334-e4986065502a from this chassis (sb_readonly=0)
Dec 05 09:47:25 compute-1 ovn_controller[95809]: 2025-12-05T09:47:25Z|00268|binding|INFO|Setting lport b627bc29-57d3-4693-9334-e4986065502a down in Southbound
Dec 05 09:47:25 compute-1 ovn_controller[95809]: 2025-12-05T09:47:25Z|00269|binding|INFO|Removing iface tapb627bc29-57 ovn-installed in OVS
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.796 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.811 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:25 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Deactivated successfully.
Dec 05 09:47:25 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Consumed 12.560s CPU time.
Dec 05 09:47:25 compute-1 systemd-machined[154815]: Machine qemu-22-instance-00000037 terminated.
Dec 05 09:47:25 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:25.856 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:43:ca 10.100.0.7'], port_security=['fa:16:3e:43:43:ca 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4018e31a-e4b1-4786-8a23-a6824cfe324b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71ae38f4-8c55-4792-9422-fcbcaa5f09ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0279a3edc4d145a3815f122291b494bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fd674063-42aa-4a30-b09e-f4f4a4cb3d69', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51cdcecf-b9b8-4fb1-8306-272b186b0af7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>], logical_port=b627bc29-57d3-4693-9334-e4986065502a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc06f54c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:47:25 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:25.858 105272 INFO neutron.agent.ovn.metadata.agent [-] Port b627bc29-57d3-4693-9334-e4986065502a in datapath 71ae38f4-8c55-4792-9422-fcbcaa5f09ab unbound from our chassis
Dec 05 09:47:25 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:25.860 105272 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 71ae38f4-8c55-4792-9422-fcbcaa5f09ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 05 09:47:25 compute-1 podman[234076]: 2025-12-05 09:47:25.863218523 +0000 UTC m=+0.058441954 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:47:25 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:25.863 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0b8a1d-2c06-4c70-bd14-24437bd6507f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:25 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:25.864 105272 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab namespace which is not needed anymore
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.975 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:25 compute-1 nova_compute[189066]: 2025-12-05 09:47:25.978 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:26 compute-1 neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab[233972]: [NOTICE]   (233994) : haproxy version is 2.8.14-c23fe91
Dec 05 09:47:26 compute-1 neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab[233972]: [NOTICE]   (233994) : path to executable is /usr/sbin/haproxy
Dec 05 09:47:26 compute-1 neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab[233972]: [WARNING]  (233994) : Exiting Master process...
Dec 05 09:47:26 compute-1 neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab[233972]: [ALERT]    (233994) : Current worker (234000) exited with code 143 (Terminated)
Dec 05 09:47:26 compute-1 neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab[233972]: [WARNING]  (233994) : All workers exited. Exiting... (0)
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.022 189070 INFO nova.virt.libvirt.driver [-] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Instance destroyed successfully.
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.023 189070 DEBUG nova.objects.instance [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lazy-loading 'resources' on Instance uuid 4018e31a-e4b1-4786-8a23-a6824cfe324b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 05 09:47:26 compute-1 systemd[1]: libpod-5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604.scope: Deactivated successfully.
Dec 05 09:47:26 compute-1 podman[234122]: 2025-12-05 09:47:26.028582365 +0000 UTC m=+0.055739167 container died 5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 05 09:47:26 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604-userdata-shm.mount: Deactivated successfully.
Dec 05 09:47:26 compute-1 systemd[1]: var-lib-containers-storage-overlay-d0d01513a596861ad521e323c13fd25c9e39be09a56039fb0f7bdfc736f32bf7-merged.mount: Deactivated successfully.
Dec 05 09:47:26 compute-1 podman[234122]: 2025-12-05 09:47:26.064126035 +0000 UTC m=+0.091282827 container cleanup 5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:47:26 compute-1 systemd[1]: libpod-conmon-5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604.scope: Deactivated successfully.
Dec 05 09:47:26 compute-1 podman[234165]: 2025-12-05 09:47:26.134786747 +0000 UTC m=+0.044964862 container remove 5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.141 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd5a4df-8495-43b9-91a8-4f728d98608c]: (4, ('Fri Dec  5 09:47:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab (5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604)\n5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604\nFri Dec  5 09:47:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab (5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604)\n5ed7bfbd8409e25e3e515b3d71c4e778df055f071b175fe35d68c01da83da604\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.144 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[59b7d134-c95a-4d01-9c30-61f74574f8dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.145 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71ae38f4-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.148 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:26 compute-1 kernel: tap71ae38f4-80: left promiscuous mode
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.163 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.166 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[045be5fb-d94d-45d7-bee3-bd9da668e3d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.186 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[6762fff5-6a37-49ac-a739-033011d7ea09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.187 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4c8537-53b1-4de7-b756-00661e68389e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.207 220556 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b1af93-4653-42f1-87be-15f1c4aa6714]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530496, 'reachable_time': 42955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234184, 'error': None, 'target': 'ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:26 compute-1 systemd[1]: run-netns-ovnmeta\x2d71ae38f4\x2d8c55\x2d4792\x2d9422\x2dfcbcaa5f09ab.mount: Deactivated successfully.
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.212 105809 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-71ae38f4-8c55-4792-9422-fcbcaa5f09ab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 05 09:47:26 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:26.213 105809 DEBUG oslo.privsep.daemon [-] privsep: reply[9f790ea8-de80-4631-907f-dd06c625ae4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.259 189070 DEBUG nova.virt.libvirt.vif [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T09:47:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-738303947',display_name='tempest-TestServerMultinode-server-738303947',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-738303947',id=55,image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T09:47:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0279a3edc4d145a3815f122291b494bf',ramdisk_id='',reservation_id='r-im3vetyr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='3ebffd97-b242-42d7-b245-ebdaf8e4377c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1600634262',owner_user_name='tempest-TestServerMultinode-1600634262-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T09:47:13Z,user_data=None,user_id='d96d7aa420944947ad70e496aec2bea1',uuid=4018e31a-e4b1-4786-8a23-a6824cfe324b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.261 189070 DEBUG nova.network.os_vif_util [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Converting VIF {"id": "b627bc29-57d3-4693-9334-e4986065502a", "address": "fa:16:3e:43:43:ca", "network": {"id": "71ae38f4-8c55-4792-9422-fcbcaa5f09ab", "bridge": "br-int", "label": "tempest-TestServerMultinode-503343283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66bb236c6ae84372bfe66c14c34eff73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb627bc29-57", "ovs_interfaceid": "b627bc29-57d3-4693-9334-e4986065502a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.261 189070 DEBUG nova.network.os_vif_util [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:43:ca,bridge_name='br-int',has_traffic_filtering=True,id=b627bc29-57d3-4693-9334-e4986065502a,network=Network(71ae38f4-8c55-4792-9422-fcbcaa5f09ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb627bc29-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.262 189070 DEBUG os_vif [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:43:ca,bridge_name='br-int',has_traffic_filtering=True,id=b627bc29-57d3-4693-9334-e4986065502a,network=Network(71ae38f4-8c55-4792-9422-fcbcaa5f09ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb627bc29-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.265 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.265 189070 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb627bc29-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.270 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.275 189070 INFO os_vif [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:43:ca,bridge_name='br-int',has_traffic_filtering=True,id=b627bc29-57d3-4693-9334-e4986065502a,network=Network(71ae38f4-8c55-4792-9422-fcbcaa5f09ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb627bc29-57')
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.275 189070 INFO nova.virt.libvirt.driver [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Deleting instance files /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b_del
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.276 189070 INFO nova.virt.libvirt.driver [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Deletion of /var/lib/nova/instances/4018e31a-e4b1-4786-8a23-a6824cfe324b_del complete
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.454 189070 INFO nova.compute.manager [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Took 0.71 seconds to destroy the instance on the hypervisor.
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.455 189070 DEBUG oslo.service.loopingcall [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.456 189070 DEBUG nova.compute.manager [-] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.456 189070 DEBUG nova.network.neutron [-] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 05 09:47:26 compute-1 nova_compute[189066]: 2025-12-05 09:47:26.639 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.181 189070 DEBUG nova.network.neutron [-] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.205 189070 INFO nova.compute.manager [-] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Took 0.75 seconds to deallocate network for instance.
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.260 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.260 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.341 189070 DEBUG nova.scheduler.client.report [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Refreshing inventories for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.417 189070 DEBUG nova.scheduler.client.report [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Updating ProviderTree inventory for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.418 189070 DEBUG nova.compute.provider_tree [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Updating inventory in ProviderTree for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.445 189070 DEBUG nova.scheduler.client.report [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Refreshing aggregate associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.477 189070 DEBUG nova.scheduler.client.report [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Refreshing trait associations for resource provider be68f9f1-7820-4bfa-8dbd-210e13729f64, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.533 189070 DEBUG nova.compute.provider_tree [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.550 189070 DEBUG nova.scheduler.client.report [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.574 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:27 compute-1 nova_compute[189066]: 2025-12-05 09:47:27.611 189070 INFO nova.scheduler.client.report [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Deleted allocations for instance 4018e31a-e4b1-4786-8a23-a6824cfe324b
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.670 189070 DEBUG oslo_concurrency.lockutils [None req-7043a1bb-281a-4a9f-9fe2-e0b9dbe53362 d96d7aa420944947ad70e496aec2bea1 0279a3edc4d145a3815f122291b494bf - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.961 189070 DEBUG nova.compute.manager [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received event network-vif-unplugged-b627bc29-57d3-4693-9334-e4986065502a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.962 189070 DEBUG oslo_concurrency.lockutils [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.962 189070 DEBUG oslo_concurrency.lockutils [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.962 189070 DEBUG oslo_concurrency.lockutils [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.963 189070 DEBUG nova.compute.manager [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] No waiting events found dispatching network-vif-unplugged-b627bc29-57d3-4693-9334-e4986065502a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.963 189070 WARNING nova.compute.manager [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received unexpected event network-vif-unplugged-b627bc29-57d3-4693-9334-e4986065502a for instance with vm_state deleted and task_state None.
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.963 189070 DEBUG nova.compute.manager [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received event network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.963 189070 DEBUG oslo_concurrency.lockutils [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Acquiring lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.964 189070 DEBUG oslo_concurrency.lockutils [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.964 189070 DEBUG oslo_concurrency.lockutils [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] Lock "4018e31a-e4b1-4786-8a23-a6824cfe324b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.964 189070 DEBUG nova.compute.manager [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] No waiting events found dispatching network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.964 189070 WARNING nova.compute.manager [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received unexpected event network-vif-plugged-b627bc29-57d3-4693-9334-e4986065502a for instance with vm_state deleted and task_state None.
Dec 05 09:47:28 compute-1 nova_compute[189066]: 2025-12-05 09:47:28.965 189070 DEBUG nova.compute.manager [req-70943175-d44d-4744-b22d-47c41b24208d req-afe04b23-c8bf-4710-83e9-090b8c9ab0f2 300efca1163a4646b45e6146c4831f3b f968ed8d66cb40dd9205c6353b3b33e8 - - default default] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Received event network-vif-deleted-b627bc29-57d3-4693-9334-e4986065502a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 05 09:47:29 compute-1 nova_compute[189066]: 2025-12-05 09:47:29.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:29 compute-1 nova_compute[189066]: 2025-12-05 09:47:29.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:47:29 compute-1 nova_compute[189066]: 2025-12-05 09:47:29.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:47:29 compute-1 nova_compute[189066]: 2025-12-05 09:47:29.225 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:47:29 compute-1 podman[234185]: 2025-12-05 09:47:29.652256713 +0000 UTC m=+0.083070826 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:47:31 compute-1 nova_compute[189066]: 2025-12-05 09:47:31.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:31 compute-1 nova_compute[189066]: 2025-12-05 09:47:31.020 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:47:31 compute-1 nova_compute[189066]: 2025-12-05 09:47:31.268 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:31 compute-1 nova_compute[189066]: 2025-12-05 09:47:31.641 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:32 compute-1 nova_compute[189066]: 2025-12-05 09:47:32.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:33 compute-1 nova_compute[189066]: 2025-12-05 09:47:33.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:33 compute-1 nova_compute[189066]: 2025-12-05 09:47:33.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.562 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.563 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.563 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.563 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.758 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.760 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5722MB free_disk=73.32378387451172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.760 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.760 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.838 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.838 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.863 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.880 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.917 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:47:34 compute-1 nova_compute[189066]: 2025-12-05 09:47:34.917 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:47:35 compute-1 nova_compute[189066]: 2025-12-05 09:47:35.918 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:36 compute-1 nova_compute[189066]: 2025-12-05 09:47:36.270 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:36 compute-1 nova_compute[189066]: 2025-12-05 09:47:36.643 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:36 compute-1 podman[234211]: 2025-12-05 09:47:36.650660961 +0000 UTC m=+0.091793030 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:47:36 compute-1 sshd-session[233839]: ssh_dispatch_run_fatal: Connection from 101.47.162.91 port 58786: Connection timed out [preauth]
Dec 05 09:47:37 compute-1 nova_compute[189066]: 2025-12-05 09:47:37.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:37 compute-1 nova_compute[189066]: 2025-12-05 09:47:37.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 05 09:47:38 compute-1 nova_compute[189066]: 2025-12-05 09:47:38.088 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 05 09:47:38 compute-1 nova_compute[189066]: 2025-12-05 09:47:38.089 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:38 compute-1 nova_compute[189066]: 2025-12-05 09:47:38.089 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 05 09:47:41 compute-1 nova_compute[189066]: 2025-12-05 09:47:41.019 189070 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764928046.0190063, 4018e31a-e4b1-4786-8a23-a6824cfe324b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 05 09:47:41 compute-1 nova_compute[189066]: 2025-12-05 09:47:41.020 189070 INFO nova.compute.manager [-] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] VM Stopped (Lifecycle Event)
Dec 05 09:47:41 compute-1 nova_compute[189066]: 2025-12-05 09:47:41.271 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:41 compute-1 nova_compute[189066]: 2025-12-05 09:47:41.544 189070 DEBUG nova.compute.manager [None req-a9339317-a3e3-49f9-98f1-b954551dcb26 - - - - - -] [instance: 4018e31a-e4b1-4786-8a23-a6824cfe324b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 05 09:47:41 compute-1 nova_compute[189066]: 2025-12-05 09:47:41.645 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:43 compute-1 podman[234232]: 2025-12-05 09:47:43.65860158 +0000 UTC m=+0.100387510 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 05 09:47:44 compute-1 nova_compute[189066]: 2025-12-05 09:47:44.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:47:46 compute-1 nova_compute[189066]: 2025-12-05 09:47:46.043 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:46 compute-1 nova_compute[189066]: 2025-12-05 09:47:46.273 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:46 compute-1 podman[234259]: 2025-12-05 09:47:46.609590455 +0000 UTC m=+0.054228500 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 09:47:46 compute-1 nova_compute[189066]: 2025-12-05 09:47:46.647 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:49 compute-1 podman[234278]: 2025-12-05 09:47:49.654775528 +0000 UTC m=+0.094799725 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 09:47:51 compute-1 nova_compute[189066]: 2025-12-05 09:47:51.276 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:51 compute-1 nova_compute[189066]: 2025-12-05 09:47:51.649 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:52 compute-1 podman[234296]: 2025-12-05 09:47:52.624150073 +0000 UTC m=+0.061518528 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Dec 05 09:47:56 compute-1 nova_compute[189066]: 2025-12-05 09:47:56.278 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:56 compute-1 podman[234318]: 2025-12-05 09:47:56.620839213 +0000 UTC m=+0.061732724 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 09:47:56 compute-1 nova_compute[189066]: 2025-12-05 09:47:56.651 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:57.919 105272 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:d3:89', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '8e:43:2c:7d:20:dd'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 05 09:47:57 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:57.920 105272 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 05 09:47:57 compute-1 nova_compute[189066]: 2025-12-05 09:47:57.920 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:47:59 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:47:59.923 105272 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2deaed7a-68f6-453c-b7f8-10ef033f3762, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 09:48:00 compute-1 podman[234343]: 2025-12-05 09:48:00.624733529 +0000 UTC m=+0.064380429 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 05 09:48:01 compute-1 nova_compute[189066]: 2025-12-05 09:48:01.279 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:01 compute-1 nova_compute[189066]: 2025-12-05 09:48:01.653 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:06 compute-1 nova_compute[189066]: 2025-12-05 09:48:06.377 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:06 compute-1 nova_compute[189066]: 2025-12-05 09:48:06.655 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:07 compute-1 podman[234368]: 2025-12-05 09:48:07.637634842 +0000 UTC m=+0.078079585 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 05 09:48:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:48:08.898 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:48:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:48:08.900 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:48:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:48:08.900 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:48:11 compute-1 nova_compute[189066]: 2025-12-05 09:48:11.379 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:11 compute-1 nova_compute[189066]: 2025-12-05 09:48:11.657 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:14 compute-1 podman[234389]: 2025-12-05 09:48:14.664789854 +0000 UTC m=+0.104732107 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Dec 05 09:48:16 compute-1 nova_compute[189066]: 2025-12-05 09:48:16.398 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:16 compute-1 nova_compute[189066]: 2025-12-05 09:48:16.658 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:17 compute-1 podman[234417]: 2025-12-05 09:48:17.614836075 +0000 UTC m=+0.054392193 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 05 09:48:20 compute-1 podman[234436]: 2025-12-05 09:48:20.615717993 +0000 UTC m=+0.059309625 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.schema-version=1.0)
Dec 05 09:48:21 compute-1 nova_compute[189066]: 2025-12-05 09:48:21.400 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:21 compute-1 nova_compute[189066]: 2025-12-05 09:48:21.659 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:23 compute-1 podman[234456]: 2025-12-05 09:48:23.615244787 +0000 UTC m=+0.054828985 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, release=1755695350, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Dec 05 09:48:26 compute-1 nova_compute[189066]: 2025-12-05 09:48:26.435 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:26 compute-1 nova_compute[189066]: 2025-12-05 09:48:26.662 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:27 compute-1 nova_compute[189066]: 2025-12-05 09:48:27.498 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:27 compute-1 nova_compute[189066]: 2025-12-05 09:48:27.498 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:27 compute-1 podman[234478]: 2025-12-05 09:48:27.65898798 +0000 UTC m=+0.080203856 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:48:28 compute-1 nova_compute[189066]: 2025-12-05 09:48:28.014 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:29 compute-1 nova_compute[189066]: 2025-12-05 09:48:29.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:30 compute-1 nova_compute[189066]: 2025-12-05 09:48:30.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:30 compute-1 nova_compute[189066]: 2025-12-05 09:48:30.022 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:48:30 compute-1 nova_compute[189066]: 2025-12-05 09:48:30.023 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:48:30 compute-1 nova_compute[189066]: 2025-12-05 09:48:30.057 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:48:31 compute-1 nova_compute[189066]: 2025-12-05 09:48:31.438 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:31 compute-1 podman[234502]: 2025-12-05 09:48:31.624453704 +0000 UTC m=+0.054516267 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 05 09:48:31 compute-1 nova_compute[189066]: 2025-12-05 09:48:31.663 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:32 compute-1 nova_compute[189066]: 2025-12-05 09:48:32.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:32 compute-1 nova_compute[189066]: 2025-12-05 09:48:32.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:32 compute-1 nova_compute[189066]: 2025-12-05 09:48:32.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:48:33 compute-1 nova_compute[189066]: 2025-12-05 09:48:33.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.054 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.054 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:48:35 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.055 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:48:35 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.055 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.225 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.226 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5756MB free_disk=73.32380294799805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.227 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.227 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.289 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.290 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.311 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.332 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.333 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:48:35 compute-1 nova_compute[189066]: 2025-12-05 09:48:35.333 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:48:36 compute-1 nova_compute[189066]: 2025-12-05 09:48:36.440 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:36 compute-1 nova_compute[189066]: 2025-12-05 09:48:36.664 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:38 compute-1 podman[234527]: 2025-12-05 09:48:38.629787362 +0000 UTC m=+0.066910811 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 05 09:48:41 compute-1 ovn_controller[95809]: 2025-12-05T09:48:41Z|00270|memory_trim|INFO|Detected inactivity (last active 30028 ms ago): trimming memory
Dec 05 09:48:41 compute-1 nova_compute[189066]: 2025-12-05 09:48:41.442 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:41 compute-1 nova_compute[189066]: 2025-12-05 09:48:41.667 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:45 compute-1 podman[234548]: 2025-12-05 09:48:45.670936667 +0000 UTC m=+0.104719717 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 05 09:48:46 compute-1 nova_compute[189066]: 2025-12-05 09:48:46.494 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:46 compute-1 nova_compute[189066]: 2025-12-05 09:48:46.674 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:48 compute-1 podman[234574]: 2025-12-05 09:48:48.634828177 +0000 UTC m=+0.074503557 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 05 09:48:51 compute-1 nova_compute[189066]: 2025-12-05 09:48:51.530 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:51 compute-1 podman[234593]: 2025-12-05 09:48:51.640746668 +0000 UTC m=+0.079168831 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 05 09:48:51 compute-1 nova_compute[189066]: 2025-12-05 09:48:51.675 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:54 compute-1 podman[234613]: 2025-12-05 09:48:54.626193128 +0000 UTC m=+0.065374633 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=)
Dec 05 09:48:56 compute-1 nova_compute[189066]: 2025-12-05 09:48:56.534 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:56 compute-1 nova_compute[189066]: 2025-12-05 09:48:56.677 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:48:58 compute-1 podman[234636]: 2025-12-05 09:48:58.642774605 +0000 UTC m=+0.085132617 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 05 09:49:01 compute-1 nova_compute[189066]: 2025-12-05 09:49:01.537 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:01 compute-1 nova_compute[189066]: 2025-12-05 09:49:01.679 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:02 compute-1 podman[234660]: 2025-12-05 09:49:02.615485617 +0000 UTC m=+0.058922276 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 05 09:49:06 compute-1 nova_compute[189066]: 2025-12-05 09:49:06.575 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:06 compute-1 nova_compute[189066]: 2025-12-05 09:49:06.681 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:49:08.900 105272 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:49:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:49:08.902 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:49:08 compute-1 ovn_metadata_agent[105267]: 2025-12-05 09:49:08.902 105272 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:49:09 compute-1 podman[234685]: 2025-12-05 09:49:09.616562191 +0000 UTC m=+0.060702739 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:10 compute-1 ceilometer_agent_compute[198994]: 2025-12-05 09:49:10.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 05 09:49:11 compute-1 nova_compute[189066]: 2025-12-05 09:49:11.620 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:11 compute-1 nova_compute[189066]: 2025-12-05 09:49:11.682 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:16 compute-1 nova_compute[189066]: 2025-12-05 09:49:16.684 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:49:16 compute-1 nova_compute[189066]: 2025-12-05 09:49:16.686 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:49:16 compute-1 nova_compute[189066]: 2025-12-05 09:49:16.686 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 09:49:16 compute-1 nova_compute[189066]: 2025-12-05 09:49:16.686 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:49:16 compute-1 nova_compute[189066]: 2025-12-05 09:49:16.690 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:16 compute-1 nova_compute[189066]: 2025-12-05 09:49:16.691 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:49:16 compute-1 nova_compute[189066]: 2025-12-05 09:49:16.691 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:16 compute-1 podman[234705]: 2025-12-05 09:49:16.73787032 +0000 UTC m=+0.175130553 container health_status 0cdc951f3b455a1459471b20e1f1626ca53f702831c125f835d1b670aeb17ccf (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 05 09:49:19 compute-1 podman[234731]: 2025-12-05 09:49:19.628795973 +0000 UTC m=+0.056479275 container health_status e8cea6352187f28e672aa25c227f2705a90d58074b8cf42235a56df5d4026fd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 05 09:49:21 compute-1 nova_compute[189066]: 2025-12-05 09:49:21.692 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:49:21 compute-1 nova_compute[189066]: 2025-12-05 09:49:21.694 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:49:21 compute-1 nova_compute[189066]: 2025-12-05 09:49:21.695 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 05 09:49:21 compute-1 nova_compute[189066]: 2025-12-05 09:49:21.695 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:49:21 compute-1 nova_compute[189066]: 2025-12-05 09:49:21.722 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:21 compute-1 nova_compute[189066]: 2025-12-05 09:49:21.723 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 05 09:49:22 compute-1 podman[234750]: 2025-12-05 09:49:22.656206971 +0000 UTC m=+0.096566438 container health_status 3f3288243aefe700d53cffce184353529fbaf6e44f1c19ec4792ed576af41176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 09:49:25 compute-1 podman[234771]: 2025-12-05 09:49:25.639441885 +0000 UTC m=+0.068572541 container health_status b07f36300303a5c1729056a61e344d6c6fd9b2e404082d8e8ce7185d45652c2b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 09:49:26 compute-1 nova_compute[189066]: 2025-12-05 09:49:26.723 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:26 compute-1 nova_compute[189066]: 2025-12-05 09:49:26.726 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:49:27 compute-1 nova_compute[189066]: 2025-12-05 09:49:27.334 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:27 compute-1 sshd-session[234792]: Accepted publickey for zuul from 192.168.122.10 port 33926 ssh2: ECDSA SHA256:xFeNApY160LxGIiSPyA1pr6/OAQjwgC1t0EAbYvSE3Q
Dec 05 09:49:27 compute-1 systemd-logind[807]: New session 32 of user zuul.
Dec 05 09:49:27 compute-1 systemd[1]: Started Session 32 of User zuul.
Dec 05 09:49:27 compute-1 sshd-session[234792]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 09:49:27 compute-1 sudo[234796]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 05 09:49:27 compute-1 sudo[234796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 09:49:28 compute-1 nova_compute[189066]: 2025-12-05 09:49:28.015 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:28 compute-1 podman[234830]: 2025-12-05 09:49:28.955471915 +0000 UTC m=+0.078253089 container health_status b9f45cdcb567bb250381fa80d86c78c226d5638c7de1b6d79ee017275814ff68 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 09:49:31 compute-1 nova_compute[189066]: 2025-12-05 09:49:31.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:31 compute-1 nova_compute[189066]: 2025-12-05 09:49:31.726 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 05 09:49:31 compute-1 nova_compute[189066]: 2025-12-05 09:49:31.777 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:32 compute-1 nova_compute[189066]: 2025-12-05 09:49:32.021 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:32 compute-1 nova_compute[189066]: 2025-12-05 09:49:32.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 05 09:49:32 compute-1 nova_compute[189066]: 2025-12-05 09:49:32.021 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 05 09:49:32 compute-1 nova_compute[189066]: 2025-12-05 09:49:32.070 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 05 09:49:32 compute-1 nova_compute[189066]: 2025-12-05 09:49:32.071 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:32 compute-1 nova_compute[189066]: 2025-12-05 09:49:32.071 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:32 compute-1 nova_compute[189066]: 2025-12-05 09:49:32.071 189070 DEBUG nova.compute.manager [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 05 09:49:32 compute-1 podman[234971]: 2025-12-05 09:49:32.859438503 +0000 UTC m=+0.072804906 container health_status 550b604b3b40c506829669f3af0f3e8dc4e7ac01464222ce7f26998708fe1a96 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 05 09:49:33 compute-1 ovs-vsctl[235022]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 05 09:49:34 compute-1 nova_compute[189066]: 2025-12-05 09:49:34.022 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:34 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 234820 (sos)
Dec 05 09:49:34 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 05 09:49:34 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 05 09:49:34 compute-1 virtqemud[188731]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 05 09:49:34 compute-1 virtqemud[188731]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 05 09:49:34 compute-1 virtqemud[188731]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.020 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.104 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.106 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.106 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.106 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.276 189070 WARNING nova.virt.libvirt.driver [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.278 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5658MB free_disk=73.30874252319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.279 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.279 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.366 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.366 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.394 189070 DEBUG nova.compute.provider_tree [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed in ProviderTree for provider: be68f9f1-7820-4bfa-8dbd-210e13729f64 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.425 189070 DEBUG nova.scheduler.client.report [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Inventory has not changed for provider be68f9f1-7820-4bfa-8dbd-210e13729f64 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.427 189070 DEBUG nova.compute.resource_tracker [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 05 09:49:35 compute-1 nova_compute[189066]: 2025-12-05 09:49:35.427 189070 DEBUG oslo_concurrency.lockutils [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 05 09:49:36 compute-1 crontab[235442]: (root) LIST (root)
Dec 05 09:49:36 compute-1 nova_compute[189066]: 2025-12-05 09:49:36.731 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:36 compute-1 nova_compute[189066]: 2025-12-05 09:49:36.779 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:37 compute-1 nova_compute[189066]: 2025-12-05 09:49:37.428 189070 DEBUG oslo_service.periodic_task [None req-4ac191e7-6301-4690-83aa-8dbb42c15763 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 05 09:49:38 compute-1 systemd[1]: Starting Hostname Service...
Dec 05 09:49:38 compute-1 systemd[1]: Started Hostname Service.
Dec 05 09:49:39 compute-1 podman[235648]: 2025-12-05 09:49:39.820778681 +0000 UTC m=+0.098572496 container health_status d60d3dfd47255bc7c068dbedc03dbf86678863f0414a1b8f8536e4d53dd67a13 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a42d21893a42142b0b00e96296a13ac9d2c0fedf55d6438ad104fd0309c63376-6e7b060d56e295bda0cde218c00ea7828671fc81db3a49bfe8d19accdca66f3a-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78-cde65eec64dc19d709ac52c107b101eccb690e2885c4e35be1bf3bdc20865c78'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 05 09:49:41 compute-1 nova_compute[189066]: 2025-12-05 09:49:41.778 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 05 09:49:41 compute-1 nova_compute[189066]: 2025-12-05 09:49:41.781 189070 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
